  Grant Application Toolbox

  Toolbox and Resources Useful in the Design and Preparation of an Education Grant Application

    There are several considerations and subsequent steps to follow once an idea for an educational research project has been put forth. This document provides tools for developing the idea into a successful educational research project and points to sources of additional information for building a project around an educational idea.

    Seven Steps to Developing a Successful Research Project

    A list of scenarios and questions is provided below. The questions vary in scope; any given research project might address only one or two questions within a larger, more complex scenario.

    First Step:
    Formulate a research question. What is the problem?

    • What is currently known in this area within the literature?
    • Are there resources from outside radiology that could assist in development?
    • What need does this investigation address, and how will the specialty benefit once it is achieved?
    • How will this investigation add to the literature?
    • What is the significance?
    • If this educational project is to replace current educational programming, how is it an improvement? The goals, objectives, and outcomes should support this claim.

    Second Step:
    What is/are the hypothesis/hypotheses?

    • Is each hypothesis clearly stated?
    • Is the hypothesis concise, with a limited number of confounding factors?
    • Does the study have practical or theoretical value?
    • Does the hypothesis lend itself to empirical testing?
    • Can data be obtained?

    Third Step:
    Develop objectives and goals.
    Three viewpoints to consider in developing goals and objectives:

    1. Does the research promote quality?
    2. Does the research build from the existing knowledge base?
    3. Does the research enhance professional development?

    Fourth Step:
    Define your target audience: who are the participants?

    • Who is going to be studied/taught (subjects)?
    • Who will immediately benefit from this investigation?
    • Other than the immediate beneficiaries, are there larger groups that will benefit?

    Design the research:
    1. From a learning design perspective.

    • The underlying philosophy of design research is that you must understand the innovative forms of education you want to bring about in order to produce them.

    2. From a technology perspective. For example:

    • Addressing complex problems in real contexts
    • Integrating known and hypothetical design principles with technological affordances to render plausible solutions to these complex problems
    • Conducting rigorous and reflective inquiry to test and refine innovative learning environments as well as to define new design principles

    3. From a curriculum perspective. For example:

    • Curriculum that addresses competency issues
    • Curriculum that addresses gaps in knowledge or skills
      • Ex: If the target audience is residents, does the curriculum include a facet from the ACGME competencies?

    Fifth Step:
    How are the objectives and goals going to be assessed?

    • If this includes a competency assessment, it is important to define how competency will be evaluated. Are there national standards currently in place?
    • Is a rubric appropriate for the project? Is one already available, or do you need to develop a new one?
    • Identify meaningful measures that are reproducible and as objective as possible.
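
    One way to gauge whether a measure is reproducible is to have two raters score the same learners and compute an inter-rater agreement statistic such as Cohen's kappa. The Python sketch below is a minimal, hypothetical example; all ratings are invented for illustration.

        # Hypothetical example: agreement between two raters scoring the
        # same ten learners on a 1-5 rubric. All ratings are invented.
        from sklearn.metrics import cohen_kappa_score

        rater_a = [3, 4, 2, 5, 4, 3, 2, 4, 5, 3]
        rater_b = [3, 4, 3, 5, 4, 2, 2, 4, 4, 3]

        # Kappa corrects raw agreement for chance: 1.0 means perfect
        # agreement, 0 means no better than chance.
        kappa = cohen_kappa_score(rater_a, rater_b)
        print(f"Cohen's kappa = {kappa:.2f}")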

    What are the methods or procedures?

    • Is the method appropriate to the hypothesis? Explain why.
    • Do procedures follow an orderly, logical sequence? Explain.
    • Is there evidence that previous studies were reviewed, placing this study in the context of the related body of knowledge?

    Sixth Step:
    How are the data going to be collected and analyzed?

    • This will largely depend on the research design and the objectives/goals of the investigation.
    • Some potential formats of data collection: surveys, interviews, competency evaluation, etc.
    • How are the data going to be recorded?
    • How are the reliability and validity of the results going to be evaluated?
    • What statistical methods are going to be used in the analysis? Why?
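
    The appropriate statistical test follows from the design. As one hypothetical example, if the same participants are assessed before and after an educational intervention, a paired t-test is a common choice; the sketch below uses invented pre/post scores for ten residents.

        # Hypothetical example: paired t-test on pre/post assessment
        # scores for the same ten residents. Scores are invented.
        from scipy import stats

        pre = [62, 70, 55, 68, 74, 60, 66, 59, 71, 63]
        post = [71, 78, 60, 75, 80, 68, 72, 65, 79, 70]

        # ttest_rel pairs each participant's two scores; a small p-value
        # suggests the observed change is unlikely under chance alone.
        t_stat, p_value = stats.ttest_rel(post, pre)
        print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")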

    Participants

    • How many subjects will participate in the study?
    • Are the participants a representative sample?
    • Is the number of participants sufficient for the planned observations? (A sample-size sketch follows this list.)
    • Has approval to conduct the study been obtained? From whom?
    • Do participants need to sign an informed consent form?
    • Has the study received IRB evaluation and approval?
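
    "Sufficient" can be made concrete with a power analysis before recruiting. The sketch below is a hypothetical example; the effect size, significance level, and power are assumptions chosen for illustration, not recommendations.

        # Hypothetical example: participants needed per group to detect a
        # medium effect with an independent two-sample t-test. Every
        # parameter here is an assumption for illustration.
        from statsmodels.stats.power import TTestIndPower

        n_per_group = TTestIndPower().solve_power(
            effect_size=0.5,  # assumed effect size (Cohen's d)
            alpha=0.05,       # significance level
            power=0.8,        # desired probability of detecting the effect
        )
        print(f"~{n_per_group:.0f} participants per group")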

    Seventh Step:
    What outcomes are expected from the results?

    • If the expected results are not attained, are there contingency plans in place? Are there plans to evaluate why the goals were not attained?
    • Can the reasons for success/failure be predicted?
    • Does the program fulfill ACGME competency requirements?

    What important conclusions are expected?

    • What are the positive aspects of this research?
    • What are the negative aspects of this research?


    Helpful resources:

    Although this list is long, not all of these resources will apply to your specific project. If you need help developing an idea, these resources can assist with initial steps or with tying up loose ends.

    Rubrics – understanding and guide to developing 

    Rubrics are a helpful means of assessing a learner: they establish guidelines by which the learner is assessed against previously set standards and their own progress. This avoids comparing one learner directly against another while still maintaining the milestones required for progression.

    Previously developed rubrics can be used; however, sometimes a project/program requires the development of a more specific rubric. Here are some steps that might make the task easier:

    • Identify the concept or goal you are assessing.
    • Identify the important dimensions or characteristics of that concept.
    • Provide a stratification of what you expect:
      • Best work provides a description of the highest level of performance.
      • Worst acceptable outcome describes the minimal or basic level of performance.
      • Unacceptable outcome describes the lowest category of performance.
    • Develop descriptions of intermediate‐level products and assign them to intermediate categories. This might resemble a Likert scale that runs from 1 to 5 (unacceptable, marginal, acceptable, good, outstanding), 1 to 3 (novice, competent, exemplary), or any other set that is meaningful.
    • For feedback, try out the rubric on colleagues to confirm that it adequately assesses your concept/goal and is achievable for the audience to which it will be applied; then revise accordingly. (A minimal code sketch of a rubric follows the resource links below.)
    • Scoring Rubrics: What, When and How, an article from Practical Assessment, Research and Evaluation (PARE), http://pareonline.net/getvn.asp?v=7&n=3  
    • NC State University, University Planning & Analysis, includes extensive list of resources for assessment of specific skills or content, http://www2.acs.ncsu.edu/UPA/assmt/resource.htm  
    • Carnegie Mellon University: Enhancing Education Through Assessment, http://www.cmu.edu/teaching/assessment/index.html 
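
    As a minimal sketch, the stratification steps above can be captured in a plain data structure. The dimension names and level descriptions below are invented examples, not a standard rubric.

        # Hypothetical example: a rubric as a dictionary mapping each
        # dimension to descriptions of its performance levels (1-5 scale).
        RUBRIC = {
            "hypothesis clarity": {
                1: "unacceptable: no testable hypothesis stated",
                3: "acceptable: testable, but confounders unaddressed",
                5: "outstanding: concise, testable, confounders controlled",
            },
            "methods rigor": {
                1: "unacceptable: methods do not follow from the hypothesis",
                3: "acceptable: orderly methods with minor gaps",
                5: "outstanding: rigorous, reproducible, well justified",
            },
        }

        def overall_score(ratings):
            """Average one rater's level ratings across all dimensions."""
            assert set(ratings) == set(RUBRIC), "rate every dimension"
            return sum(ratings.values()) / len(ratings)

        print(overall_score({"hypothesis clarity": 4, "methods rigor": 3}))  # 3.5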

    Curriculum Development

    • Dent, J., & Harden, R. M. (2013). A Practical Guide for Medical Teachers. Churchill Livingstone. Section 3: Educational Strategies  
    • Diamond, R. M. (2011). Designing and Assessing Courses and Curricula. John Wiley & Sons. Chapters 1 (A learning centered approach to course and curriculum design), 2 (expanding role of faculty in accreditation and accountability), 4 (scholarship and faculty rewards), 5 (Introduction to the model and its benefits), 9 (linking goals, courses and curricula), 10 (gathering and analyzing essential data), 20 (meeting the needs of adult learners)  
    • Kern, D. E., Thomas, P. A., & Hughes, M. T. (2009). Curriculum Development for Medical Education. Johns Hopkins University Press.
    • Southgate, L., Hays, R. B., Norcini, J., Mulholland, H., Ayers, B., Woolliscroft, J., et al. (2001). Setting performance standards for medical practice: a theoretical framework. Medical Education, 35(5), 474–481. doi:10.1046/j.1365-2923.2001.00897.x

    Assessment

    • Dent, J., & Harden, R. M. (2013). A Practical Guide for Medical Teachers. Churchill Livingstone. Section 6: Assessment 
    • Suskie, L. (2010). Assessing Student Learning. John Wiley & Sons. [nice resource for all around general information]
    • Walvoord, B. E. (2010). Assessment Clear and Simple: A Practical Guide for Institutions, Departments, and General Education. [nice basic resource]

    Validity and Reliability 

    • Moskal, B. M., & Leydens, J. A. (2000). Scoring rubric development: Validity and reliability. Practical Assessment, Research & Evaluation, 7(10), 1–11.
    • Shavelson, R. J., & Huang, L. (2003). Responding Responsibly. Change (Abstracts), 35(1), 10–19.

    Clinical Education and Assessment            

    • Farmer, E. A., & Page, G. (2005). A practical guide to assessing clinical decision-making skills using the key features approach. Medical Education, 39(12), 1188–1194. doi:10.1111/j.1365-2929.2005.02339.x
    • Friedman, A. J., Cosby, R., Boyko, S., Hatton-Bauer, J., & Turnbull, G. (2010). Effective Teaching Strategies and Methods of Delivery for Patient Education: A Systematic Review and Practice Guideline Recommendations. Journal of Cancer Education, 26(1), 12–21. doi:10.1007/s13187-010-0183-x

    Competency 

    If the project is focused on ACGME-based competencies (the details of which the ACGME defers to each program), review the resources at the ACGME website. Each residency program director should have access to the department's specific competency requirements. It is helpful to contact the residency program director to ensure the project is aligned with the residency's goals.

    • http://www.acgme.org/acgmeweb/  
    • http://www.acgme.org/acgmeweb/Portals/0/PDFs/commonguide/VA1_Evaluation_ResidentFormativeEval_Documentation.pdf (search under ACGME site for ‘competency assessment methods’)
    • Gunderman, R. B. (2009). Competency-based Training: Conformity and the Pursuit of Educational Excellence. Radiology, 252(2), 324–326. doi:10.1148/radiol.2522082183
    • Leung, W.-C. (2002). Competency based medical training: review. BMJ (Clinical research ed.), 325(7366), 693–696.
    • Morag, E., Lieberman, G., Volkan, K., Shaffer, K., Novelline, R., & Lang, E. V. (2001). Clinical competence assessment in radiology: introduction of an objective structured clinical examination in the medical school curriculum. Academic radiology, 8(1), 74–81. doi:10.1016/S1076-6332(03)80746-8      
    • Newble, D. (2004). Techniques for measuring clinical competence: objective structured clinical examinations. Medical Education, 38(2), 199–203. doi:10.1046/j.1365-2923.2004.01755.x
    • Rothwell, W. J., & Graber, J. M. (2010). Competency-Based Training Basics. American Society for Training and Development.
    • Smee, S. (2003). ABC of learning and teaching in medicine: skill based assessment. BMJ: British Medical Journal, 326(7391), 703.
    • Swanson, D. B., Norman, G. R., & Linn, R. L. (1995). Performance-Based Assessment: Lessons From the Health Professions. Educational researcher, 24(5), 5–11. doi:10.3102/0013189X024005005
    • Williamson, K. B., Steele, J. L., Gunderman, R. B., Wilkin, T. D., Tarver, R. D., Jackson, V. P., & Kreipke, D. L. (2002). Assessing Radiology Resident Reporting Skills. Radiology, 225(3), 719–722. doi:10.1148/radiol.2253011335

    Accreditation in higher education              

    • Wergin, J. (2005a). Higher Education Waking Up to the Importance of Accreditation. Change (Abstracts), 35–41.
    • Wergin, J. (2005b). Taking responsibility for student learning: the role of accreditation. Change (Abstracts), 37(1), 30–33. doi:10.3200/CHNG.37.1.30-33
    • If working with resident education programs, it is helpful to refer to the ACGME website for any new developments. http://www.acgme.org/acgmeweb/ 

    Grant Writing 

    RSNA 

    Other 


    Additional resources for those wanting more depth:

    Learning theory 

    • Bransford, J., Vye, N., Stevens, R., Kuhl, P., Schwartz, D., Bell, P., et al. (2005). Learning theories and education: Toward a decade of synergy. Handbook of Educational Psychology (2nd Edition).
    • Brookfield, S. (1995). Adult learning: An overview. International encyclopedia of education, 1–16.
    • Pratt, D. D. (2006). Three stages of teacher competence: A developmental perspective. New Directions For Adult & Continuing Education, 1989(43), 77–87. doi:10.1002/ace.36719894309

    Rubrics 

    • Stevens, D. D., & Levi, A. J. (2005). Introduction To Rubrics. Stylus Publishing, LLC.   

    Curriculum Development 

    • Heirich, M. (1980). The people we teach: aids to course planning. Teaching Sociology, 281–302.   

    Educational Research 

    • Burkhardt, H., & Schoenfeld, A. H. (2003). Improving educational research: Toward a more useful, more influential, and better-funded enterprise. Educational researcher, 32(9), 3–14.        
    • Committee on Research in Education, National Research Council. (2004). Advancing Scientific Research in Education. National Academies Press.
    • van den Akker, J., Gravemeijer, K., McKenney, S., & Nieveen, N. (2006). Introducing educational design research. Routledge New York, NY.   

    Assessment 

    • Ewell, P. T. (2005). Can Assessment Serve Accountability? It Depends on the Question. In J. Wergin (Ed.), Achieving Accountability in Higher Education; balancing public, academic, and market demands (pp. 104–124). San Francisco: Jossey-Bass.
    • Palomba, C. A., & Banta, T. W. (1999a). The essentials of successful assessment. In Assessment Essentials (p. 405). Jossey-Bass.
    • Palomba, C. A., & Banta, T. W. (1999b). Selecting methods and approaches. In Assessment essentials (pp. 85–113). Jossey-Bass.
    • Pike, G. R. (2002). Measurement issues in outcomes assessment. Building a scholarship of assessment, 131–164.
    • Schuh, J. H. (2009). Assessment methods for student affairs. Jossey-Bass Inc Pub. Chapter 3: Planning for and implementing data collection; Chapter 4: Selecting, sampling and soliciting subjects 
    • Walvoord, B. E. (2003). Assessment in accelerated learning programs: A practical guide. New Directions For Adult & Continuing Education, 2003(97), 39–50. doi:10.1002/ace.87
    • Wang, X., & Hurley, S. (2012). Assessment as a Scholarly Activity?: Faculty Perceptions of and Willingness to Engage in Student Learning Assessment.
