Ken Thomas - Instructional Designer | According to Ken... (Blog Posts)



According to Ken...

Ken doesn't currently run a blogging site. The following posts are selections from his contributions to discussions in various LinkedIn groups he participates in.

Screen Text and Audio

from LinkedIn: Instructional Design & E-Learning Professionals' Group - June 2012

Kathleen:

In e-learning blogs I've repeatedly seen "Don't simply have the audio repeat the words on the screen." I dislike negative instructions that say what to avoid without saying what should be done. I have yet to see an adequate description of the kind of content to include in text and audio in order to vary the two. If the voice/audio should not simply repeat the words on the computer screen, what kind of content should it relate? I tend to relate specific examples in a storytelling mode for the audio. What do you do?

Ken:

Well, here's what I do:

The audio is a tight script of "natural" prose - an informal tone written and read as though the narrator was talking directly to the learner in a one-on-one setting.

As the audio plays, supportive images and bulleted "key points" of text are added to the screen in sync with the audio. Here's an example that I think gets it right -- maybe not for corporate learning, but a good use of animation, graphics, audio, and screen text:

http://teacher.scholastic.com/activities/studyjams/water_cycle/

Good luck!

Kathleen:

Ken, I know this is off topic of my original discussion, but let me know if this is what you are saying. Basically you indicate that the best design for learning (for a particular concept) is the best for everyone. This is because we all learn the same way, and it doesn't matter what our personal preferences or perception of the course is.

Ken:

Kathleen - As for there being one best way... no, that's not my position. My position is that there are hundreds of ways to approach a training intervention - one may work better for one learner, another may work better for a different learner, etc. Building all the ways to accommodate all the learners individually isn't responsible, and relying on learner preference for how to present the learning is somewhat like relying on your children's preference for what to serve for dinner... in my house, every meal would be pizza and Twinkies with cherry soda.

What I'm saying is that an instructional designer must be well versed in current learning theory, current technology, and current design tools -- the designer must also know their audience and what will "work" for that audience. Here's the sticking point of learner preference - I say "good design is good design," and I mean that if your learners say they'd like a button that turns on an audio version of the same text that's on the screen, it's my job to educate the client that this is not instructionally sound, back that up with research as needed, and show them an alternative that actually works.

Is my way "the only way?" No - it's not. There are hundreds of valid and excellent ways to approach any subject and meet any objective - but there are also hundreds of invalid and poor ways, as well. I rarely find it worth spending the client's money on more than one way - so my responsibility is to pick and sell the idea I think best meets the need and ultimately meets the objective for the audience. This is where I'm saying know the audience - it enters the equation, but does NOT drive the solution.

Hope that clarifies it.

As for your original question, I already recommended two books that hit what you're asking: Efficiency in Learning (Clark, Nguyen, & Sweller) and Graphics for Learning (Clark & Lyons). You'll find Mayer heavily referenced in both of these texts, as well. Both are loaded with research and offer sound instructional guidelines. I don't believe you'll find an absolute "do this and then this" level of steps for your situation, but I still believe the link I posted earlier is an excellent real-life example of what the end product should resemble in leveraging text and audio.


Essential Qualities of an Instructional Designer

from LinkedIn: Instructional Design & E-Learning Professionals' Group - June 2012

Krishna:

What do you think are the essential qualities to become an Instructional Designer?

Ken:

Few "real world" ID projects run the way they do in the text books... clients change their minds, SMEs provide incorrect information, budgets get cut, deadlines move, etc. For this reason, the number one quality I look for is "adaptable" - I want a team member who can see each setback as an opportunity, and who can constantly remap a path to the end goal.

BTW - I disagree with "knowing the subject matter" as an essential quality. To me, a true ID is an expert at working with SMEs to define objectives, then gaining a working knowledge of the subject - enough to create appropriate content maps and instructional strategies. Does subject matter knowledge help? Sure! But I see too many companies promote an SME into an ID position because they have the subject matter knowledge - this usually ends badly.


Resource Recommendations for Moving a Team from ILT to eLearning

from LinkedIn: Instructional Design & E-Learning Professionals' Group - June 2012

Tamsin:

Can anyone recommend any books on moving my team from developing face-to-face training materials to producing effective e-learning materials?

Ken:

Reading a bookshelf of books won't get you where you're going. You need a partner who can coach and guide you through the transition, setting up your style guides, templates, production pathways, and libraries... there's no single book - or even set of books - that goes from soup to nuts on this.

I recommend the following approach, if you can afford it: partner with an e-Learning vendor who can get you set up with the appropriate tools for your organization (even if it's a long-term plan). Build a strategy where they work with you to design your production pathway, from design planning to LMS tracking and reporting. For the first year, maybe they develop 90% of your courseware while your team learns the ropes - the second year, 80/20 - the third year, 50/50 - etc., until they are only needed in a coaching role. The vendor can shift to a role where they're reviewing your design plans and maybe even storyboards - then your team can build the "simple" pages and let the vendor develop the more complex ones (and show your team how it's done when they're ready).

I know this sounds like a lot, but trust me - there's no reason to reinvent wheels and learn from your own mistakes. Also, without this approach you are almost certain to develop very poor materials your first year out, which will in turn cost your team - and the approach - credibility (trust me - I've been brought in several times to fix this exact scenario).


Designing with PowerPoint

from LinkedIn: Instructional Design & E-Learning Professionals' Group - April 2012

Tara:

The company I am working for has decided that we will build our e-learning tutorials inside PowerPoint. I have not developed in PowerPoint for quite some time. Are there any good resources out there that could quickly catch me up on how to keep the tutorials engaging and interesting? I don't want them to look like a PowerPoint. Any other tools you would suggest looking into?

Ken:

Good design is good design. Define your objectives and figure out the best way to meet them, THEN figure out the best way to accomplish this within your limits.

90% of what's been done in PP sucks, and 85% of what's been done in Articulate and Captivate sucks, but that's a limitation of the people creating the crap. Explore what the outliers have built (other posters provided good sites).

PP is a surprisingly powerful tool for creating CBT - its weakness is tracking and reporting (Articulate can help somewhat).

Bring in a "power user" to build templates with you and show you what's possible.


Level 2 Scores

from LinkedIn: Instructional Design & E-Learning Professionals' Group - June 2012

Micky:

I'm ignoring low L2 scores... do you too? We have some CBTs and instructor-led assessments where a few questions are stuck between 67% and 88% correct (below our target baselines).

We are going to ignore these low scores for now, because participants are performing correctly in the workplace. Because of workloads, I don't want to do busywork updating the materials if the outcomes in the workplace are hitting the desired metrics!

Ken:

Part 1:

Explore readings on "item analysis" - pay the most attention to predictive validity.

If high-performing employees score WELL on a specific item and poor performers get the same item wrong, you have achieved the desired result - you've discovered a "differentiator," a predictive item that can help you identify who will likely do well in the field.
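
If you want to play with this on your own data before diving into the readings, here's a minimal sketch of one classic item-analysis statistic - an upper/lower-group discrimination index. (This is just my illustration - the function, the 27% rule of thumb, and the sample numbers are all hypothetical, not from any particular text.)

    # Upper/lower-group discrimination index (illustrative sketch).
    # Compares how often the top and bottom scorers get a given item right.
    def discrimination_index(total_scores, item_correct):
        """total_scores: overall test score per learner.
        item_correct: 1 if that learner got the item right, else 0."""
        ranked = sorted(zip(total_scores, item_correct), key=lambda pair: pair[0])
        n = max(1, len(ranked) * 27 // 100)  # conventional 27% tails
        lower = [correct for _, correct in ranked[:n]]
        upper = [correct for _, correct in ranked[-n:]]
        # Near +1.0 = strong differentiator; near 0 or negative = problem item.
        return sum(upper) / n - sum(lower) / n

    # A hypothetical class of ten learners:
    scores  = [55, 60, 62, 70, 75, 80, 85, 88, 92, 95]
    answers = [ 0,  0,  1,  0,  1,  1,  1,  1,  1,  1]
    print(discrimination_index(scores, answers))  # 1.0 - a clean differentiator

An item where your top performers land near the bottom of that scale is exactly the Part 2 problem below.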

Part 2:

However, if your high performers are getting the item wrong, you've got a problem... one that should not be ignored. Do the following - get SMEs to review the objective (should the entire objective be discarded?), the instruction (is it being covered correctly and accurately?), and the assessment item (is the item a clear indicator of mastery?) -- also consider a small focus group or a couple of interviews to have your learners explain their rationale for their answers along with their thought process.

So in closing, no - don't ignore your level 2 data.

Good luck!!


Practice, Practice, Practice

from LinkedIn: Instructional Design Professional Group - June 2012

Joy:

I just had a trainer tell me "we don't have time for people to practice."

I would like opinions... how do you explain the importance of practice in training? I did a quick Google search, but it didn't yield what I was looking for. Can anyone point me towards articles or books that explain this, so I have something to back it up with?

Ken:

I really like "rehearsal opportunities" on the job... as a constructivist, I can tell you that nothing is more effective at encouraging and supporting "transfer of learning" than on-the-job rehearsal of the desired behavior.

In studies going back to the 1970s, repetition has been compared to meaningful rehearsal in cued recall and recognition... which are basically necessary components of on-the-job performance (i.e., recognizing WHEN to perform the new or changed task or behavior). What better way to get your brain recognizing when to apply the new learning than to do it on the job (as opposed to "pretend you're on the job" scenarios in the classroom)?

I wish I could attend your workshop - good luck!!


QA Approach?

from LinkedIn: Instructional Design Professional Group - June 2012

Tina:

My main objective: ID companies need to include an editing process as part of the development phase, and hire editors to work as language and grammar SMEs. QA could then truly focus on quality assurance.

Kathy:

@Tina, that's how my company operates. We have SMEs who QA the content for quality/accuracy/etc., and then an editing group that performs language/grammar/typo changes as needed. We don't have them edit until after the SMEs QA the materials.

Ken:

Tina - I'd only recommend caution about thinking of building ID products as an assembly line. If Keebler had to make every individual cookie in the box to different specifications, those elves who live in the hollow tree would have a different approach to QA. That having been said, I agree there's plenty of room in an ID shop to bring Six Sigma principles to the table - I've just seen too many creative shops turn into assembly lines pumping out cookie-cutter approaches to increase efficiency and profit, while crushing creativity and ultimately driving out the talent that made the group successful in the first place.

Kathy - I think that's the right application and probably represents what Tina is saying.

Ciao.


Do You Storyboard?

from LinkedIn: Instructional Systems Design Professional - June 2012

Roy:

When developing eLearning courses, do you create storyboards/scripts? For what purpose(s)?

  1. No, we do not use Storyboards
  2. Yes, for graphic look
  3. Yes, for content discussion
  4. Yes, for content review

Ken:

I couldn't really answer this... I typically don't use storyboards anymore, but do for 1) new clients, 2) clients with large dispersed review teams, and 3) complex animations or interactions (where I want sign off before developing).

More typically, my clients will sign off on a detailed design plan for content and treatment, then get an "alpha" version of the courseware (all screen text, rough audio, all clipart, some final art, rough draft animations, etc.), and then beta (including all comments from Alpha and finished images and audio), and finally "final."


Adobe

from LinkedIn: ISD Consultants and Contractors - February 2012

Louise:

Down the Adobe rabbit hole

Can anyone explain the difference between Connect's e-learning feature (they say it does asynchronous) and Captivate?

Ken:

Louise - you have every right to be confused, especially if you've been relying on Adobe to explain the capabilities - you really have to read the fine print to figure out that you don't ACTUALLY create your content with Connect:

"Easily create and deploy custom training programs that mix and reuse a variety of training assets and activities. Enable nontechnical subject matter experts to create self-paced, on-demand courses directly from Microsoft PowerPoint using Adobe Presenter software. Add voice-over narration and multimedia content to your presentations also using Presenter, or capture screen recordings and create interactive simulations and how-to demos using Adobe Captivate® software."

So what you're REALLY doing with Connect is creating a "home page" for your learners to access selected content you've created with other tools -- this is the "mini LMS" experience that John explained.

Basically, a main use of Connect is creating a virtual classroom experience - similar in many ways to the tools a lot of online colleges are using to create a single place for students in a course to access their assignments, launch self-paced eLearning modules, access scanned articles, and breakout into groups for discussions (these breakout rooms are just threaded discussion groups - this is the "asynchronous" aspect you asked about).

So as John pointed out, you'll need other tools to create your content, and you can use Connect to create an access point and virtual classroom.

For a better (better than Adobe's) description of Adobe Connect (including the web conferencing features I didn't get into), see the following review:

http://web-conferencing-services.toptenreviews.com/adobe-connect-pro-review.html

Good luck!!

Ken:

Macromedia used to own the eLearning tools (Authorware, Dreamweaver, and Captivate) and Adobe owned all the graphics programs (Photoshop & Illustrator). Adobe's site has always been complicated and unclear (although BEAUTIFUL), while Macromedia's site was almost utilitarian (but quick and easy to navigate).

Adobe's acquisition of Macromedia was great for so many reasons... but the wrong team ended up owning their advertising. Holy smokes! I have DSL and it takes almost a minute for some of their pages to load, and don't get me started on their insane packaging strategies! I have yet to find a package that meets all my needs, and I really can't throw down the money for the "EVERYTHING" package.

Oh well... a friend of mine said it best in a coaching moment - "Ken, they won - you lost. Learn their ways, buy their products, and shut up." I still vent about their web experience and figuring out their product line, but honestly, it's a great set of tools. Learn them, love them, use them. ;-)

As for selecting Adobe Connect as an LMS, or selecting ANY LMS, don't start by selecting the tool... that's a path to disaster (as a friend of mine said in a meeting, "Sometimes you just can't afford 'free.'"). Start by performing analyses to determine your needs - then create a set of requirements for your LMS (or any tool, for that matter). Once you have your requirements defined (I like a four-section set of requirements - "Need to Have," "Would Like to Have," "Would Like to Avoid," "Cannot Have"), you're ready to start evaluating your tools, including your LMS.
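
To make those four buckets concrete, here's a tiny hypothetical sketch of screening a candidate tool against them (the requirement and feature names are invented for illustration - yours will come out of your analyses):

    # Hypothetical four-bucket requirements screen (invented examples).
    requirements = {
        "need_to_have": {"tracks multi-day events", "custom reports"},
        "would_like_to_have": {"single sign-on"},
        "would_like_to_avoid": {"per-seat licensing"},
        "cannot_have": {"core features require paid customization"},
    }

    def screen(tool_features):
        """Disqualify a tool missing a need-to-have or matching a cannot-have;
        the two middle buckets become tie-breakers among tools that pass."""
        missing = requirements["need_to_have"] - tool_features
        fatal = requirements["cannot_have"] & tool_features
        return (not missing and not fatal), missing, fatal

    ok, missing, fatal = screen({"tracks multi-day events", "per-seat licensing"})
    print(ok, missing, fatal)  # False {'custom reports'} set()

If the vendor's demo can't prove out every need-to-have hands-on, keep looking - my painful example below shows what skipping that rigor costs.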

I've been on LMS selection and implementation teams for 3 large companies - and I've seen the process go south quickly by taking short cuts.

Ken:

My painful example:

One of my companies wanted an LMS to track and report learner attendance and performance in a multi-week new hire training program (not an unreasonable request). The "winner" ran a demo showing how easy it was to do just that. We also wanted to run reports to show a learner's scores in a series of new hire tests and compare those scores to their performance after 1 month, 3 months, 6 months, and 12 months on the job - again, connecting to our company's HR systems, that would be a breeze with customized reports (design the report, save it, then run it whenever you like). Finally, we wanted to cluster information like instructor and site to sort that data (i.e., is any one instructor or site better/worse at selecting, training, and coaching new hires? - then we could explore best/worst practices for the company) - of course, a tweak to the reporting, and voilà! - no problem.

We bought the tool (over $1 million), spent a year implementing it (an internal team of 15 people dedicated full-time to the project), got the puppy up and running (yay!)... then found out we couldn't track a multi-day or multi-score "event"... which meant that in order to even get the data we needed INTO the system, the facilitator had to create a separate event for each day of class in order to track attendance (btw, no "templates" or clustering of learners - each learner had to be individually added to the day's event), and then each formative test had to be created as yet another separate event... so an instructor had to spend an hour and a half each day basically creating a new roster and entering scores... then when we actually went to run the reports, the reporting tool was not able to access the data stored in the system.

When we met with the vendor to express our frustration (i.e., this was in our requirements and you even demonstrated this as a basic function of your tool!), we were told they would have to customize our version of the LMS - for an additional $250K. We actually spent that money - and the customized LMS was still unable to perform the functionality. Finally, we came up with a strategy that would work, and the LMS company released a new version - which our license said we'd get for free (all upgrades free of charge for 5 years). HOWEVER, since we had customized the software, we couldn't use the upgrades until they, too, were customized for us... another $100K for that.

Blah, blah, blah - it was good money after bad, and this was one of the major players out there (I won't say which for SEVERAL reasons).

The leads on this project were not fired - not punished. In fact, they got big bonuses for their accomplishments. When the LMS contract came up for re-bid, they went through the same selection process as before and ended up buying the same LMS as before - they stated their main reason was, "A good rapport with the vendor and the vendor's vast experience in our special environment." I swear, you just can't make this stuff up.

BTW - disclaimer - I used to work for a company that designed LMSs, and I worked with the team who designed and developed the initial AICC standards. When that world got too big, we pulled out and focused on designing and developing training, and leveraged our own proprietary LCMS, which could be customized to communicate with AICC- (obviously) and SCORM-compliant LMSs.


SMEness?

from LinkedIn: ISD Consultants and Contractors - November 2011

Theresa:

Looking for an ISD Consultant who has experience developing statistics courses

Dan:

Why is it that ISDs are expected to be SMEs? I've successfully produced materials that train doctors to operate, and I'm not a physician; pilots to fly, and I'm not one; mothers to breastfeed, and I know that is not a possibility for me (male) - along with a ton of other successful courses. I'm an ISD - I don't care what the content is; the process works regardless. If you are really looking for success with whatever training is being built, wouldn't you rather have someone who knows how to build successful courses? It sounds like you already have a lot of the SME requirement on board!

Ken:

While I respect Dan's position (i.e., if an expert in ID is properly paired with SMEs, they're capable of designing and building training in virtually any subject), the request was clear enough. Having experience building courses in statistics will reduce the time it takes the designer to get up to speed.

If I were currently available, I'd love to throw my hat in the ring - in the meantime, some general words of advice...

I've built statistical analysis courses and workshops for HUD, and my advice on any statistics course is this: keep the scenarios relevant, and always strive to keep your problems concrete and applicable to your learners' environment. When statistics is taught in college, it's often taught abstractly, or through word problems that cannot easily be leveraged in the "real world" (unless your job really is buying oranges and apples).

One audience at HUD, for example, was the group that analyzes lenders' records to determine if minorities are being treated fairly. As their final test, they were given a collection of real data from several loan officers (we changed the names), and they had to select the appropriate strategies to analyze the data, then apply those strategies... ultimately, a direct match to their job. The problems and activities leading up to it scaffolded toward that end test.

Good luck!

Dan:

The question really becomes: how important is this training to those who are asking for it? What I've found in situations just like you are describing - SMEs without enough time to contribute to an ISD effort - is that when the ISD delivers the material they created for review, it is not liked, and then much more effort is put into the project rewrite than would have been needed if an SME had helped to begin with! Except now you have a bunch of different people, with different ideas of how to solve the issue/problem at hand, creating materials that are all different - and all expecting you to use their effort, since they took the time to respond!

Ken:

@Dan - my read (Theresa, correct me if I'm wrong) is that the SMEs don't have time to get a non-statistics ID up to speed on the concepts and math of statistics, not that they won't have time to review and sign off on objectives, design plans, storyboards, etc. For this reason, they've decided to jump-start the project by hiring someone who doesn't need to "get up to speed."

I worked on a project that required in-depth engineering knowledge on the part of the designer - I was not qualified to be that designer, but was able to meet a couple of times to brainstorm treatment... when I saw the content, I was glad I didn't draw that straw. I'm not saying you need to be an SME to design training, but I do believe there are topics where a certain level of SME-ness is a great asset, and other topics where, if the designer cannot grasp the material, the project is in jeopardy (no matter how available the SMEs are).

Ciao.


Contracting and Commitments

from LinkedIn: ISD Consultants and Contractors - November 2011

Dan:

Contract Work Question

I'm new to the consultant world and need some advice. I'm pursuing both contract and full-time work, preferring full time.

I'm not sure how to handle the situation if I get a contract job and then get a full-time offer before the contract is up. Is two weeks' notice appropriate? I don't want to burn any bridges, because in the future I want to primarily do contract work.

What have others done when this situation occurs?

Steve:

In my experience, 2 weeks is good enough. Your contract has, by definition, a fixed time limit, and most employers understand that you will always be looking for the 'next' opportunity.

Ken:

Few things suck more than starting a new job while finishing an old one! Here are some thoughts:

1) Consider whether you want to keep an open door to the contracting world (and whether this client will be one of your future return clients) - this may guide how you want to handle the situation.

2) Consider how "in the lurch" you'd be leaving your current client. If your leaving will screw your current client, refer to thought #1 - you may need to negotiate either a fee differential or some "on-the-house" transition time to help a replacement resource come up to speed.

3) If you're at a good transition point or one's on the horizon, have an open discussion with the client - like Susan pointed out, most people are actually quite realistic about full time opportunities (it's a completely different story if you're dumping them for another client!!!).

4) Taking thoughts #1 & #2 into consideration, consider having a discussion with your new employer - possibly a soft start date or some flexibility for a set time period. Many of my peers have disagreed with this strategy, but here's my thinking - how you exit your current responsibilities is a pretty good indicator of what your new manager can expect when you leave them. I was ready to close the deal with a prospect - when I asked "when can you start?" he responded that he could start tomorrow, laughing at how that would screw his current project... bad call, dude - I did a 180 on him right there and then.

After all that, Steve is certainly right - if you can't reach a reasonable strategy to satisfy everyone, 2 weeks is good enough - you have to look after your own health and sanity ESPECIALLY when starting a new gig. Susan adds some great follow-up points. Bottom line, your current client shouldn't feel screwed (at the very least they should recognize you've done everything you could to be open, honest, and fair) and your new employer shouldn't feel you're not bringing your A game to the table.


Unusual Assignments

from LinkedIn: ISD Consultants and Contractors - May 2011

Kevin:

What is the most unusual thing you've ever created training for?

For me, it was creating training for undercover narcotics investigations... in other words, training narcs. Long-haired, bearded, foul-language-using, macho SMEs. How about you? Was there something you trained on that, if you tell us, you'll have to kill us?

Ken:

Safer Sex Workshop. Four hours of very colorful discussion and some hands-on activities that left participants breathless. I had to pack a duffle bag with about 20 dildos, lube, and a thousand condoms. I used to play, "I'll give you $20 if you can guess what I have in this bag" - I never lost a round.

The strangest part of that workshop was the T3 - all the facilitators had to get together and get "desensitized" to any question that may come up (e.g., "what's my risk level for doing abc? what about xyz?") - I've never been the same since.

Another course I designed and facilitated was part of the Drug-Free Workplace program back in the 80s for various government agencies. The training itself wasn't that unusual, but we had a great video we used that was absolutely surreal - Howard Hesseman (of The Committee improv troupe - later known as Johnny Fever on WKRP) had created a video called "The Drug Tape," where he talked in depth about various drugs and would demonstrate the effects of each. So, for example, he talked about heroin, then simulated taking it and going through a heroin trip. THAT was awesome.


Evaluation - Behavior Change vs. Business Impact

from LinkedIn: Organization Development & Training Forum - June 2012

Greg:

Am designing a module on training evaluation. Need help in clearly bringing out the differences between Behaviour Change and Business Impact (Kirkpatrick)

Ken:

Greg - The original question seems well answered. As for your second question (at what level of evaluation do you peg your learning interventions?), my answer is: it depends...

If I'm training Parachute Packers how to properly pack a parachute, I don't care about their Level 1, I demand a 100% on Level 2, a 100% on Level 3, and I don't care about Level 4. In other words, I don't care if they like the training and I don't care if we made money - I just want to make sure they really know their job. (Yes, of course at SOME level I really do care about Level 1 and 4, but I'm just making my point.)

If, on the other hand, I'm providing training on our new dress code, my numbers are going to be much lower... maybe even "attendance" or "acknowledged" will be fine (MY boss will probably only care about Level 3 - are they following the new dress code?).

The real strategy is to understand the outcomes of your training, and only gather the data critical to your objective (i.e., I don't think you have to gather Level 4 on every training).

Greg:

Hi Ken - like what you've written, thanks. I guess the full form of ID is It Depends.....

I was reading somewhere that Kirkpatrick is now being viewed not as a hierarchy, but as: "What is the business impact? To achieve that impact, what is the required behaviour change? And to make this behaviour change happen, what kind of reaction do we need from the participants?" etc.

Ken:

Greg - I LOVE "It Depends" - I'm TOTALLY stealing that from you!!

As for looking backward, I've been lucky enough to meet Don Kirkpatrick and actually had lunch with him when he was still practicing. I think I can say that's not how he sees the model, and it's not how I view it.

I only need one case to prove my point, right? I'll run through a couple to beat the dead horse:

1) If I roll out an unpopular program that makes the company a lot of money, I could have TERRIBLE Level 1 scores, yet very high Level 4s.

2) If I roll out a new policy course that's required by law (like HIPAA several years ago), my Level 4 is weak... compliance is rarely about "making money" - it's about keeping your license, not being fined, reducing the impact of a negative judgement, etc. So, I could have great Level 1, Level 2, and Level 3 results, yet an expectedly poor Level 4... but since "staying legal" wasn't going to drive revenue, I knew the most important aspect was actually Level 3 - when we're audited, are my people doing the required behavior to comply with the law? I wouldn't say that Level 1 really "drives" Level 2... you can probably make more of a case for a well-designed Level 2 having a strong correlation to Level 3, and then Level 4 may be dependent on Level 3, but "it depends" (see how I brought that back?).

This is in alignment with the conversation I had with Don, and I feel he would agree. Each Level meets a specific need and has a specific function - they're not hierarchical, other than the desired relationship between Levels 2 & 3.

Screenshots

from LinkedIn: Organization Development & Training Forum - April 2012

Clint:

What are your views on using step-by-step instructions and screenshots in Participant Guides for certification courses?

The course I am currently working on is an intermediate certification course for computer software. To start out, we will be training current partners who have been with the company several years, but the course will be used for all new partners as well. Some of the processes use a wizard, while others require the end user to navigate to advanced options. Are screenshots necessary - do they enhance the training, or do they distract from it?

Ken:

I'm not a fan of this outdated approach, but not because it distracts from training... here are my thoughts...

Step by step procedures belong in an online help or Knowledge Management system, where they can be accessed on the job while performing the task (and where they can be immediately revised and instantly available to all who are using them). By putting step-by-step procedures in a printed guide, you're asking the learner to memorize the steps... this will ultimately lead to task failure.

Screen shots in a printed resource should be limited to screen introductions and "un-obvious" aspects of the screen design (i.e., "Then click the OK button" is not going to be enhanced in ANY way by a graphic of the screen with the OK button highlighted).

Procedures in a printed resource should be limited to concepts - or flows... instead of the detailed clicks and field entries, imagine a procedural flow like this for handling a delinquent payment:

Detailed steps (in Knowledge Management System):

Step 1: Find the customer's account in BILLING SYSTEM.
Step 2: Access the Account Summary screen.
Step 3: Find the total amount the customer owes in the Total Due field.
Step 4: Find the amount of that total that is delinquent in the Past Due field.
Step 5: Offer the customer the option of either paying the total amount or just the amount past due.
Step 6: Use the Credit Card Payment process to collect the payment.

In this example, I might include one screen shot of the Account Summary screen with the key fields highlighted.

What I'd put in the Participant Guide:

When handling a delinquent bill call:
1) Figure out how much the customer owes now.
2) Figure out how much of that is past due.
3) Offer the customer the option of paying the total due or just the amount past due.
4) Process the payment.

The Participant Guide would then include a couple of scenarios to work through (although I'd still rather handle those via online simulation).

BTW, if you do plan to use screen shots in printed material, I recommend a) using SnagIt (the best tool I've found so far to manage your screen shots) and b) saving your shots in PNG format.

Remember, your screen shots will be native to 96dpi - print quality is typically 600dpi. That's why many screen shots appear fuzzy in printed materials. GIFs reduce to 72dpi (I believe) and JPGs are "lossy" (they're great for photos, but terrible for screen captures). PNG is a lossless file format, appropriate for both online and print.
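
If the dpi point seems abstract, here's the back-of-the-envelope arithmetic (the capture dimensions are just a hypothetical example):

    # Why screen captures look fuzzy in print (illustrative arithmetic).
    width_px, height_px = 1280, 720    # a hypothetical screen capture
    screen_dpi, print_dpi = 96, 600

    # At its native 96 dpi, the capture "wants" to print physically huge:
    print(width_px / screen_dpi, "x", height_px / screen_dpi, "inches")  # ~13.3 x 7.5

    # Shrunk to a 4-inch-wide figure, a 600 dpi printer needs 2400 pixels across:
    needed_px = 4 * print_dpi
    print(width_px / needed_px)  # ~0.53 - barely half the pixels the printer wants

The file format doesn't change that math, but a lossless PNG at least preserves every pixel you captured, while a lossy JPG throws detail away before the printer ever sees it.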