November 16, 2010

TRISH REDMON: "Using the IQ-ATR, the

Indicators of Quality for Assistive Technology Reuse, to

Improve the Reuse Program."

I'm Trish Redmon, one of the participants in

developing the indicators of quality. But as I said

before, Lindsey Bean Kampwerth, our colleague in

St. Louis, is the person responsible for our project,

and she can't be with us this afternoon unless she gets

out of her appointment at the last minute. So we'll be

walking through the presentation that we prepared.


I want to thank you all for participating

today. If you have not already done so, you may wish to

take a look at some documents later that are loaded to

our knowledge base on the Pass It On Center website that

are going to give you a form you may need to participate

in a little giveaway.

As an incentive to use the online program

assessment tool, we're going to give three $50 gift

cards away to programs that tell us how they've made use

of the online tool after this webinar, but we'll talk

about that a little more at the end.

We want to thank Caroline Van Howe and the

Assistive Technology Industry Association for providing

the technology that makes this webinar series possible.

This webinar will be recorded and transcribed

by Kimberly Griffin. The slides, the audio recording,

and the transcript will be made available on the Pass It

On Center website in a few weeks.

ATIA has been a wonderful partner on this and

in providing a national venue twice yearly for a strand

of sessions on AT reuse, and starting last year, for an

all-day preconference workshop on improving reuse programs.


Caroline Van Howe is with us from ATIA today,

and we have an upcoming session in Orlando, and I'm

going to give Caroline an opportunity to talk for a

moment about that conference.

CAROLINE VAN HOWE: Thank you very much.


This is Caroline Van Howe with ATIA as Trish

mentioned. I'd just like to welcome you to the webinar.

I'd also like to give you a little bit more information

about the AT reuse strand that is one of the educational

strands at the ATIA conference.

There are over 200 educational sessions at the

conference, which takes place in January in Orlando,

January 26th through 29th. And part of the support that

we provide to the AT reuse team at the Pass It On Center

is that they are doing an all-day preconference seminar,

Beginning with the End in Mind: Tips and Techniques for Building, Maintaining, and Expanding AT Reuse Programs. We're pleased to announce it now at this

webinar, and I think it is being announced by the Pass

It On Center as well.

Now, there is a special discount code for a

50 percent discount on the preconference fee, which is

$275. So that's the preconference for $137.50. The

code is "pass pre 50", as it appears in the public-chat area, and it will also be available from the Pass It On Center.

So this is a special, as I said, 50 percent

discount, offered to participants of this webinar and

also to the grantees of the Pass It On Center Project.

I would be happy to take any other questions,

if anyone has any later on, on the ATIA Conference, but

now, I'd like to hand it back to Trish.

TRISH REDMON: Thank you, Caroline.

I'm having some technical difficulties, but

I'll get your screen back in just one moment.

In the meantime, if you're new to our

webinars, please take a moment to look at the right side

of your screen. The navigation panel contains the tools

that you will need to participate actively in the webinar.


First, please note the icon for speaker sound

at the bottom of your screen. You may need to turn up

the sound to hear the speakers clearly. If at any time

you're having difficulty hearing, you can type a note

into the comment section in the middle of your screen

above the list of participants. If you type there and

enter, then it will display in the public-chat area.

You may also use that area to submit questions to us at

any time during the webinar.

Let me get our slides back, and we will begin

in one moment.

If you would like to receive credits for this

webinar, you may do so through the AAC Institute. The

web address is on the slide.

I'm assuming everyone can see the slides now.

If you can't, please enter a note in the comment column.

Unfortunately, this particular webinar does

not qualify for CRCs.

Today's webinar is going to address the use of

our Indicators of Quality for Assistive Technology

reuse. These were designed in 2009 by a national group

of people from reuse programs and professionals in the field.


If you want to see the tool that we're going

to examine better, you can do so by actually opening a

separate window, perhaps, and looking at this web

address, the passitoncenter.org/IQATReuse. I'm going to

show you some screen shots of the tool when we get there.


Today our learning objectives are to

understand the Indicators of Quality for AT reuse as a

guide to promising practices for all programs. And we

want to explore some alternative methods for using the

online program assessment tool to aid program

improvement. We originally wrote the indicators of

quality and prepared the large-print document, and then

a few months later, we developed an actual online tool

to simplify using it as an assessment tool.

This tool does not evaluate you in numeric

terms, but it does present all the indicators, and gives

you a simple multiple-choice way to determine where you

might be on what we might call the maturity curve for a reuse program.


Each of the indicators identifies the circumstances for the desired outcome. So we have an indicator stated, and we have a rationale for that indicator.


So today we'll look at these as a guide for

promising practices. We're going to explore some ways

for you to use this online tool to evaluate your program

or portions of your program. Each of the quality

indicators has key factors for consideration that allow

you to determine how close you come to the promising

practices that have been identified. So we'll look at

some of those things today, not the indicators

themselves. And as I said earlier, this was developed

in 2009 by representatives from programs, the Pass It On Center, and the National Task Force for AT Reuse.

Now, the tool simply converts each indicator

into a quick multiple-choice assessment process. And we

didn't try to say, "Check each factor," because some

indicators have one or two factors, and some may have

eight or ten factors.

So as you'll see when you look at the pages,

the choices are that "This indicator doesn't even apply

to my program," so you check that and move on; or "My

program meets some of those indicators;" or "It meets

none of those indicators." So to use the tool, you

would go to the website. That's

www.passitoncenter.org/IQATReuse. And you create a

profile. If you want to use your account that retains

the information, you can create a profile. If you don't

want to create a user account, you can be anonymous,

logging in as a guest.

The tool allows users to determine whether

they want to do the entire tool or one category or

several categories. And you can do all of those. And

one of the other advantages, and we'll talk about this a

little bit more, is it provides nonjudgmental results in

the form of references. It doesn't say, "Here is your

rating." It simply provides you resources for help.

Actually, after we posted this, Carolyn Phillips, our project director, heard from someone in Kuwait who explored the indicators of quality and found much of it useful, and we've even had inquiries from someone in Korea. So it's always surprising the breadth of the

exposure you have when you attempt to define something

this useful.

These are screen shots. If you go to the

website and go from the home page over to the reuse

site, this is what the home page of the online tool will

look like. And you can see that it gives you the option

to be home, to register, or to go back to an account

that you've created previously. And this simply gives

you directions for how to create that account.

What the account does is track what you have

already done. So if you choose to use the tool and

examine only one category, when you come back, it will

show that you've already completed that category and

show you the ones that you haven't done.

So after you've done that, the basic

directions for each category tell you that you're going

to view a quality indicator that defines the factors

that contribute to the promising practices, and then

you're to check the box that best describes the status

of your program at this time.

And that's where we have the "Does not apply;

it meets none of the conditions at present; it meets

some of them; or it meets all of them." And then you

begin the survey.

At that point, you will see the ten categories

of quality indicators, and they're quality indicators

for sustainability, program operations, human resources,

user services, organizational structure and governance,

management, supplier and manufacturer relationships,

marketing, accounting, and emergency preparedness.

So at this point you could choose which

category you would like to do first, any one of those in

any order. Each category then has the category area at

the top, the name of the indicator, the statement of the

quality indicator, a rationale for why we need this

quality indicator, and then the key factors for consideration.


And below that, you see your four choices, and

you simply select the button that describes where you

are now, submit that, and it goes to the next indicator

as soon as you submit. When you have finished an entire

category, you will be given a results page.

The results page reiterates the organizational

profile that you entered when you set up your

account, if you did set up an account, and then based on

your responses, it provides a list of resources. And

for each indicator, it will give you some feedback.

For example, on the Web Exchange Services

Indicator that appears at the top of this one, it says

"best practices met." And so if best practices are met,

you won't have any resources listed.

If you have not met all the best practices,

then this results page will give you a listing of

resources, and most of those will be in the Pass It On

Center knowledge base on this website. Some of them are

external resources, which may be books, publications, or other websites that are useful.

This is simply another results page, and you

see lists and a reiteration for a different category.

When you come back to the page after

completing a category, you can see that several

categories on this screen shot show that the user has

completed these categories.

One category, marketing, is only partially

completed, and three other categories have not been

started. So that's the advantage of creating the

account. No one sees your account information except

you. You create the account, and you can come back to

it and resume wherever you were.

You can also uncheck these categories and take

the survey over at a different point to compare your

perceptions or results for the survey.

The purpose today is to talk about how you

would use this tool to make it more effective for you.

It's intended to be a way to assess where you are

perhaps on the maturity curve of a program or what you

need to do if you're a start-up program, or simply a way

to identify the specific focus that one area of your

program may need. So as with everything else, we start

out by identifying what our priorities are.

Now, many people have the privilege, as I do

with the Pass It On Center, of working with people you

just enjoy being around all the time, that you learn

from; there's no tension; no one is defensive about

anything. But I can assure you that I've lived in many

environments where that's not true.

So one advantage of having a tool like this

may be to depersonalize the change in a program. If you

have a program that needs significant change or improvement, then you can use the IQ-ATR and

the online tool because it represents the judgments of

professionals in AT reuse programs about promising

practices. It doesn't represent the opinions of the

leader of the program or of the founder of the program

or someone else.

So you can just take all the personality out of the question of how we evaluate ourselves, and say, "This is an external, objective foundation for assessing where we are now." You can use

this to defuse the potential objections to the need for change, take those traditions and personalities off the

table, and sit down and have an open discussion about

what you can do to improve the program. Because the

goal here is really for all of us, what do we need to do

to build a sustainable program to continue to fulfill

the mission we defined at the outset?

One simple use for IQ-ATR, even if you don't

have meetings about this, is to use the quality

indicators to determine whether you have a complete set

of policies and procedures for your program and all of

its activities. And if you have the complete set of

policies and procedures, "Do those things that need to

be done appear on someone's job description in your organization?"


It's one thing to say that, "We do check

recalls and warnings on devices," but as Sara Sack

pointed out to us at our conference last year -- she put

this on someone's job description. "It is your

responsibility to ensure that FDA recalls and warnings

are addressed, that we find those customers, and that we

notify them."

So there are many things like that in terms of

using this. And today we're just going to explore how

we can use the tool with different levels -- with

individuals, with internal groups, perhaps even with

external groups.

As a user, you can sit down and take this

survey alone to decide how you think your program

measures up to these defined indicators of quality.

That's a good introduction. You may find that some or

all of the categories of the indicators of quality are

good introductions for orienting new employees or volunteers.


If you're familiar with the indicators of

quality and want to benchmark your progress or

improvements, you can use the online program assessment

tool with other groups or other individuals. And this

is one way to leverage discussion and goal setting for

one area or for an entire program.

Starting with one group does not limit the

use. You can create specific working groups to include

all staff and volunteers so that you can create what

Malcolm Gladwell would call an epidemic of quality

indicators so that you're spreading what you're trying

to do.

We want to take a project approach in trying

to deal with the improvement that needs to be done. And

so we do an assessment and determine which things apply

to us and assess our status, evaluate our results, and

create some task lists, identify some priorities, and

then identify a plan. And those plans, if we want them

to be carried through to fruition, are going to include

assigning responsibility and devising timelines. And

then we want a way to monitor our progress. So all the

improvements are project plans, and this is just a tool

to get us there.

So let's look at some of the different things

we could do with this. As a manager or key employee,

you might want to walk through the tool alone, if you've

never looked at the quality indicators, to just get a

sense of where you think your program is. Walk through

and assess your own program and frame your own ideas for

improvement. Or as the leader, you may want to meet

with one key manager or key employee to review the

indicators of quality for only one category. Perhaps

you want to walk through only your user-services

category to see how you're interfacing with customers,

how you're doing on the follow-up. Are you addressing

the things that other people have identified as

promising practices? So that becomes a neutral

springboard for discussion with one other person, in

that sense.

Now, if we want to go beyond individuals, we

can look at using it within internal groups. And I

would expect that most of the time you would use it with internal groups, because this is where our focus is --

"How well are we doing our job?" But as we'll see in a

minute, there are some other opportunities.

You have managers. If you want to meet with

all your managers and identify your concerns by walking

through the quality indicators and saying, "How are we

doing in all of these areas?" and then create process

improvement plans, that's one way.

You can take it in a different way and just

focus on a single department and all the employees or

all the functional teams who do one thing -- perhaps all

the people who work in program operations, whether

they're employees or volunteers or contractors. Maybe

you want to talk about, "This is the goal. We want to

be able to say that we meet all of these indicators."

Or if you want to address work flow, maybe you want to

deal with cross-functional teams and have all of the

people who are involved in acquiring donated equipment,

repairing, refurbishing, sanitizing that equipment,

identifying how to reassign it, tracking inventory, and delivering it -- walking through the work flow together as a cross-functional team. So there are lots of ways to convene trial groups and use the tool to acquaint them

with what the expectations are.

Sara Sack at Assistive Technology for Kansans

used the tool in a rather different way. She asked six

members of the Kansas team to take the assessment

independently, and then the group met and discussed the

differences in their ratings and why they had different

perceptions about where they stood on those factors for

consideration. So that's yet another way to

approach using the tool.

External groups are probably a little

touchier, and I'm not going to sit here today and tell

you I think you should rush out with the indicators of

quality and walk through them all with an external group

without putting a great deal of thought into it. But

there are some interesting possibilities for using the

indicators with external groups.

If you've considered carefully how to use it,

it's an opportunity to have a candid examination of

where you stand if you want to use it with people who

are really committed to helping you improve your

program. Again, you can say, "This tool offers

externally identified standards for successful programs.

We want to be a sustainable program, so we want to meet

those externally identified practices, and we want to

talk about how to get there."

Maybe you need some additional financial

support to get there. That may be in the form of

equipment, facilities, larger staff. But if you want

more support, maybe you really want to share this with

the Board, or maybe you want to choose one category and

go through with your Board and say, "This is our

priority this year. We really want to focus on

improving our marketing, and here are some of the things

we need to do," or "We want to improve our program

operations because we're not meeting some of these

indicators of quality." So you could do that.

The same is true of your advisory council. If

you're not incorporated with a board of directors but are part of a governmental organization with an advisory council, you may want to do the same kind of thing with that council. If you have

supporters that you really feel comfortable enough with

to expose your vulnerabilities, then you really may want

to sit down and use the tool to spur the discussion of

needs and talk about how to promote your sustainability.

It could be a great springboard for discussion but

something you might have some concern about.

Do any of you have any comments or suggestions

about the advisability of using something like this with

external groups? Okay.

Another suggestion was that you could combine

this with your state program. This is not part of the

formal program review for state AT Act programs, but if

you chose to, you could combine the IQ-ATR with the

NATTAP quality indicators, which go beyond reuse. These

indicators are all focused on reuse, but you could use

them as a supplementary tool for state programs.

One thought is that you combine the use of the

indicators of quality and the online tool with the

common business analysis tool called SWOT analysis. I

suspect most of you have seen this at some place. This

is strengths, weaknesses, opportunities, and threats.

And strengths and weaknesses are internal

circumstances over which you have some degree of

control. Opportunities and threats are external. You

can't control them, but you do need to be able to

respond to them. So I have some suggestions about how

you might use these two together.

As you're going through the indicators of

quality, then for every one of those indicators where

you meet all of the factors for consideration, you've

obviously identified a strength for your program. And

if you've checked "meets none," then you've obviously

identified a weakness.

Now, there are several ways to capture this

information. You may want to capture this information

on something as simple as a flip chart. You may want to

use something higher end, like your computer, or if you

have smart boards, that would be great. In a working

group, this could be great. And so I think it's really

obvious that you could say, "Okay. If I meet all of

those, that's a strength." That does not mean you want to ignore it; we'll talk about that in a minute.

What would you do with those things where you

"meet some"? I think that "meet some" could be another

list, possibly a "quick hits" list. But whether it's a

"quick hit" or not depends on whether it takes an

enormous amount of resources to get from "some" to

"all." If it takes little effort and minimal resources

to get from "some" to "all," then maybe those become a

priority for "quick hits" in your program.

But as with any process improvement, we've assessed the current status. We want to set goals for improvement and develop our project plan. So if you use those identified strengths, weaknesses, and perhaps quick hits, then what must be done to meet the promising practices? You need to discuss with your team the requirements, the benefits, and the issues of adopting new policies and procedures and changing how you're doing things.


So let's go back to strengths. We don't want

to ignore them. We want to capitalize on them. So make a list of your current strengths, and then say,

"What could we possibly do to capitalize on this? This

is what makes our program better, distinctive, or very

effective. Maybe this is what makes it a really great

candidate for additional support from grants or

contracts or new partnerships with people."

So if we identify our strengths, we want to

capitalize on those. And you would want to set some

goals in your project plan that are associated with how

to capitalize on the strengths that you've identified.

At the same time, you want to remediate the

weaknesses. Some of those may make you more vulnerable

than others. If you have weaknesses that pose a serious

liability, then you'll really want to make a plan and

put those higher on your priority list.

So projects -- most of you are involved in

some kind of project management. We want to define

activities and tasks. We want to assign roles and

responsibilities and develop a timeline.

Projects are my favorite culture to work in.

Projects have beginnings and ends. They are not

committees. Committees go on forever. Projects

accomplish an objective, and they go away, and a new

project takes its place. Projects also broaden opportunities for participation from your staff: nonsupervisory or nonmanagerial people can have leadership roles that they might not otherwise experience, helping people build additional skills, grow, and learn. So

projects, even little projects, are great opportunities

for your program.

So track your progress. Once you decide which of these indicators you're really going to work toward accomplishing and changing to a "meets all," then you want a simple high-level review of your progress against the schedule that you created in the project plan, and a way to identify the items that have unexpected delays or issues. And one

really simple way of doing that is a stoplight report.

In a stoplight report, red means "this is stuck"; yellow means "we're delayed, but we're really working on it and making some progress"; and green means "on track" or "ahead of schedule." So this is a very simple approach to how to

use the tool.

We'd like to have your comments or suggestions

on how you think you could use the IQ-ATR to improve the

program or if you think some of these suggestions would

work for your program or what kind of issues you might

encounter or if you have comments or suggestions about

other ways to use the tool.

I'm going to stop for a minute and see what

people think about the possibilities.

JOY KNISKERN: This is Joy, and I'm wondering

who has had a chance to use this and what you've

discovered from that process.

TRISH REDMON: Diana, did you have a comment?

We can't hear you. Okay.

I assume that most of you have not tried this.

We really want to encourage you to try it, even if you only do this as an individual, or if you choose very limited interactions with a manager or another key employee to compare impressions, before you attempt to use it with a group. It would be really interesting for us to know what we could do to improve the tool.

As we developed the tool, we initially went

through some very complex consideration of how to

actually create ratings and decided that was not the

best starting point. But maybe you would like to see

something more complicated than what we have done.

At this point, Barclay, we don't know how many

people have completed it because we have not really --

we monitor how many people have created accounts, but we

have not looked at what they're doing. We don't intrude

into the individual accounts to see how much they've

completed. That's why we're asking people to try and

give us some feedback in a different way to see how

successful their efforts were at using it.

The other question we have is, "Are the PIOC

quality indicators the same as the RESNA quality indicators?"


No, they are not.

JOY KNISKERN: Trish, this is Joy speaking,

and I also see that Brian Bard has commented on whether

or not we can open the IQ reuse assessment site directly

from the webinar. If that's a possibility, that would

be great.

TRISH REDMON: Caroline, I don't believe we

can do that in this webinar room, right? Can we? Okay.

I will attempt to do that then.

JOY KNISKERN: This is Joy.

While Trish is opening up the website, first

of all, we really appreciate everyone joining us here

today, and I'm kind of wondering if anybody here has

experimented with the IQ-ATR and would be willing to openly share some of your experiences -- positive, not so

great, suggestions.

We're really open to your input because we are

continually looking at that and seeing if there are gaps

or needs and ways we need to improve it. So if any of

you have comments you'd like to share, that would be

wonderful. And I'm sure some of you out there have used

it and have experiences with that.

As you can see, we're looking for the page

right now. We do assure you that it is on our website,

and hopefully we can find that for you in a minute and

just walk through some of this directly. I think that

hands-on experience is always helpful and always useful

to all of us.

I see a comment from Barbara. Hello, Barbara.

I hope you're doing well in Nebraska. And you haven't

used it yet. I've heard some good things through one of

our program friends, Sara Sack, and work she's doing

with you and hope that you do get a chance to get up

there and take a look at it directly. And with any

luck, we'll be able to pull that up in just a minute.


I don't want to put anybody on the spot, but

if anybody else has had a chance to -- here we go.

Good. I'll release the mic and let Trish kind of take

it away.

Does everybody see the welcome page of the website?


This is what you see when you open it up on our website -- a larger, much more visible view of the description of how the tool works.

I'm going to turn this over to Trish, if

you're available. I don't know if you can move to

additional pages here.

TRISH REDMON: Before we move on: if you register and create an account, you're going to tell us some basic information about your organization -- the organization type, the geographic service area, the population you serve, the age groups, and which reuse activities you engage in.

We'll have to log in. I don't think I can

show you without logging in here. Let's see if we can

create an account here.

What do we have here? Let me try a different


I'm not forgetting my password; I'm just

trying to avoid giving you my administrative password.

Let's go back and see if I can just register as a guest

and not use my name here. Okay. It doesn't want to

allow me to do that.

So if you create an account -- I have already

created my account so you're not seeing the profile I

created, and of course, my account is a little phony

too. So my personal account to play with says I'm a

for-profit organization, which is true in private life;

I serve a multi-state region, all disabilities, and all

ages. And you can see from this that I have actually

been through every category, and it shows them all

completed. But if I wanted to continue the survey, I

could simply uncheck these and take the survey again and

start from scratch. So let's just do that, and let's

pick one category.

I've seen all the web pages, so I don't see the

missing part, but let's just say we'll look at some of

the user services area. Let me take that away. "Begin

the survey, pick a category." Now, this is interesting.

Today it wants to tell me I've done everything, and I

can't do it over. Yes, it should allow me. Yes, here

we are.

So we are on the first page of User Services

Category. This is the first quality indicator.

"Equipment delivery to a customer," and this is the

statement of that indicator as formulated. The program

delivers or works with other groups or services to

deliver assigned devices to customers, and then we have

the rationale for that indicator. And then we have key

factors for consideration, and in this case, there are

only two. So "equipment deliveries provided or

arranged, if needed, anywhere in the program service

area and for all types of devices."

And my previous answers are still here,

obviously. I could change that, and say, "This does not

apply to me. I do no delivery," and go on to the next

quality indicator.

And the next one is "Matching devices to

customers. Appropriately trained professionals follow

documents and procedures to match customers to devices."

In this case, the program has documented procedures that

are based on standard professional practices, uses

professionals with appropriate training, ensures that

the device is consistent with the recommendations, and

so we say, "We meet some of that," and move on through the category. This category isn't very long, so I'll

click through the entire thing.

If you'll notice, when we get to Web Exchange

Services, we have a long laundry list of factors for

consideration. The Web Exchange Site User Protection

includes all of these things. That's a very long list,

and so we might say, "We meet only some of those because

we really haven't implemented all the recommended

procedures for protecting users on the Web Exchange Site."

"Customer choice," another one, and "customer trials on

devices." And here we have four factors for

consideration, and we'll say we need all of them.

"Technical assistance" and then "customer intake" --

let's change that in a minute. This is not a very long

category at all. You can see how simple this is.

You're simply reading through the indicator,

the rationale for the indicator, and the key factors to

consider, and determining your answer. Basically, if we put check boxes

here, you could just check "Yes, we do that; no, we

don't do that; yes, we do this," and then, "Well, we

meet some of those things."

And here is our results page. And at the top

of the results page, you'll see the date that you took

the survey. And this is showing the date that I

originally took the survey back in June and the profile

I've created. And then it tells me that the responses

indicate that "The program meets promising practices in

some but not all aspects of this category, and this list

may be helpful to you." And so it lists the indicator,

equipment delivery to customer: best practices were met,

so no resources are listed for it. But "matching devices to

customers" -- we have checklists because we did not meet

all of those factors for consideration. There's a

laundry list of resources in the knowledge base, and

these are in the knowledge base unless it tells you

otherwise. It tells you that they're in the Pass It On

Center knowledge base, so look for these things there.

So these are the names of articles or objects

in the knowledge base that would be helpful to you. And

in this case, you see matching devices to customers;

there's a checklist; there's a work flow; there are user

agreements and liability release examples from actual

reuse programs. Then there are articles on fitting

crutches, selecting and fitting canes, and then there

are several presentations by reuse professionals on

matching persons to equipment. Those are actually

PowerPoint presentations from conferences.

Then another indicator, Web Exchange

Services; we've listed some resources in the knowledge

base. And below that, you will see the W3C standards at

this web address are external resources that you can use

in this case. If we're looking at external resources,

for example, under "customer choice," we actually have

some links that will take you from that over to

resources on the Internet that are not Pass It On Center

resources. And you can see how long this list is in

this case. And you have the option to print the list if

you'd like to save it.

And then you can go back and continue in

another category of the survey, or you can take your

results and look at them and create these slide analyses

we discussed; you can simply have a discussion with

people about your results; you could assign people to

review some of the suggested resources and make plans

from those; identify tasks and responsibilities. If you

look at those resources, you may actually find the

solution to a part that you don't have. If you need a

procedure for doing something, you may find an example

of the procedure from an existing program. You may find

the forms they use or the tools that they have employed

to actually comply with this indicator.

Any comments on this? Questions or comments?

Now, if you've set up your account, when you

come back and log in again, you'll see, as I did, the

history of what you've done before. That's what you'll

notice each time you return and log into your account.

Any questions?

Well, we'd like to encourage you to use the

tool in some fashion, perhaps one we suggested or one we

haven't. And there is a form on the knowledge base home

page. You'll see that with the name of today's webinar,

"Using the IQ-ATR for Improving Your Reuse Program."

There's a brief description in just narrative

text of what we've talked about today. And there's a

form that if you use the tool between now and the end of

January, we would love to have you submit that, tell us

how you used it, what you thought about the tool, whether

it was helpful, and what we could do to change it. And we're going

to review those suggestions and send gift cards to

people that have come up with some ideas that would be

really helpful to other programs.

Any comments?

Let's see if I can go back to this because we

would like you to evaluate the webinar.

JOY KNISKERN: And Trish, we'll be sending out

an evaluation form for the webinar so that you could

complete that and give us some feedback.

CAROLINE VAN HOWE: Trish, if you want to get

back to the presentation, you should use the little green

"previous" button to the left-hand side of the URL,

and that should take you back all the way through the

web pages back to the original presentation.

JOY KNISKERN: Good question, Vivian.

"Is the webinar available with a Spanish

translation?"
TRISH REDMON: The webinar or the tool?
No, it isn't, but that's a wonderful

suggestion. And I'll make a note of that because I

think we could probably do that. That would be

wonderful, Vivian. If you could help us translate this,

we would love to have that.

One of the things we're doing now is

collecting suggestions for improving this.

"Is it accessible?"

Mark, I don't know. I really don't know

completely the answer to that. I thought we worked on

that.
Is it, Joy?

"Yes." Okay.

Joy says it is accessible. Caroline says it

is in HTML.

I think we put in some work on making this

accessible, but I'll go back and verify that.

And Caroline, I'm struggling to get back to my

presentation.
Thank you all for participating today. We

hope you'll try it. Okay. I can't go back to it. We

will get back to our evaluation page so that you will

know where you can evaluate the webinar today.

Thank you all for participating. Here's the

web address for our webinar evaluation survey. We would

really appreciate it if you would complete that. And I

would like to encourage you to go to the knowledge base

and download the form and try using the tool in the next

two months and tell us about your experience.

JOY KNISKERN: This is Joy, and I just wanted

to say thank you again for joining us today. We hope

that all of you will seriously consider

going to ATIA in Orlando.

As was mentioned earlier, we have a full

schedule of different presentations that will be taking

place there along with the preconference session on

starting a reuse program and operating a reuse program,

and we're very interested in getting as much

participation as we can.

Beyond that, keep an eye out for Liz's

announcements about webinars that are coming up, and

please do feel free to share with us any ideas you have

about webinars that would be particularly useful to you.

We hope in the new year that we'll have an

opportunity to contact those of you who have reuse

programs that have been listening in to find out what's

going on in your state, what's going on in your

programs.
And again, Vivian, thank you so much for

offering to work with us on Spanish translation. I know

we'll be back in touch with you. Liz Persaud will be

back in touch with you, and we'll share this information

with her.

Any other comments that Caroline or Trish

would like to share or anyone else?