  • R&E Foundation Seed The Future
  • Grant Application Toolbox

  • Toolbox and Resources Useful in the Design and Preparation of an Education Grant Application

    There are several considerations and subsequent steps to follow once an idea for an educational research project has been put forth. This document provides tools for developing that idea into a successful educational research project and points to sources of additional information for building a project around an educational idea.

    Develop Your Idea

    When developing a research topic, here are some guidelines to assist you in shaping your idea.

    What is your area of curiosity?


    For each research question related to your area of curiosity, answer the following questions:

    What does this have to do with anything?
    If you are having a hard time answering this question, then you should reevaluate your research question and/or topic.

    Who cares about the result?
    If you are the only one who is curious, find out whether it could benefit others. This may require exploring the literature or asking others in the field whether the results would help them.

    But we already knew that!
    If your idea has already been established in the research, modify it to address elements that have not yet been explored, or start with a new idea.

    Are you playing with black boxes?
    Is this a system whose inner workings are unknown? That is, the input and the output are the only known components, but the mechanism of the system is unknown.

    Is this research or evaluation?
    Research is based on developing a hypothesis and specific aims, whereas evaluation is a measure of assessing an individual's progress. Research contributes to the greater knowledge of an area; evaluation is an assessment of a program, and the results are often submitted back to a supporting or governing body (publication in the literature is less likely).

    Seven Steps to Developing a Successful Research Project

    A list of scenarios and questions is provided below. Each question varies in scope, and any given research project might address only one or two questions of a larger, more complex scenario.

    First Step:
    Formulate a research question. What is the problem?

    • What is currently known in this area within the literature?
    • Are there resources from outside Radiology that can assist in the development?
    • What is the need, and how will the specialty benefit when this area of investigation is achieved?
    • How will this investigation add to the literature?
    • What is the significance?
    • If this educational project is to replace current educational programming, how is this an improvement? The goals, objectives and outcomes should also fortify this claim.

    Second Step:
    What is/are the hypothesis/hypotheses?

    • Are hypotheses clear?
    • Is the hypothesis concise and limited in the number of confounding factors?
    • Does the study have practical or theoretical value?
    • Does the hypothesis lend itself to empirical testing?
    • Can data be obtained?

    Third Step:
    Develop objectives and goals.
    Three viewpoints to consider in developing goals and objectives:

    1. Does the research promote quality?
    2. Does the research build from the existing knowledge base?
    3. Does the research enhance professional development?

    Fourth Step:
    Define your target audience: who are the participants?

    • Who is going to be studied/taught (subjects)?
    • Who will immediately benefit from this investigation?
    • Other than the immediate beneficiaries, are there larger groups that will benefit?

    Design the research:
    1. From a learning design perspective.

    • The underlying philosophy of design research is that you have to understand the innovative forms of education that you might want to bring about in order to be able to produce them.

    2. From a technology perspective. For example:

    • Addressing complex problems in real contexts
    • Integrating known and hypothetical design principles with technological affordances to render plausible solutions to these complex problems
    • Conducting rigorous and reflective inquiry to test and refine innovative learning environments as well as to define new design principles

    3. From a curriculum perspective. For example:

    • Curriculum that addresses competency issues
    • Curriculum that addresses gaps in knowledge or skills
      • Ex: If the target audience is residents, does it include a facet from the ACGME competencies?

    Fifth Step:
    How are the objectives and goals going to be assessed?

    • If this includes a competency assessment, defining how competency is to be evaluated is important. Are there national standards currently in place?
    • Is a rubric appropriate for the project? Is one available, or do you have to develop a new one?
    • Identify meaningful measures that are reproducible and as objective as possible.

    What are the methods or procedures?

    • Is the method appropriate to the hypothesis? Explain why.
    • Do procedures follow an orderly, logical sequence? Explain.
    • Is there evidence of a review of previous studies to indicate the context of this study within the related body of knowledge?

    Sixth Step:
    How is data going to be collected and analyzed?

    • This will largely depend on the research design and the objectives/goals of the investigation.
    • Some potential formats of data collection: surveys, interviews, competency evaluation, etc.
    • How is the data going to be recorded?
    • How is the reliability and validity of the results going to be evaluated?
    • What statistical analysis is going to be used? Why?


    • How many subjects will participate in the study?
    • Are the studied participants a representative case sample?
    • Are there a sufficient number of participants for observation?
    • Was approval to conduct the study obtained? From whom?
    • Do participants need to sign an informed consent?
    • IRB evaluation and approval
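    One common way to evaluate the reliability mentioned above is inter-rater agreement: have two raters score the same participants and measure how often they agree beyond chance. The following is a minimal sketch (not part of the original toolbox; the raters' scores are hypothetical) of Cohen's kappa in Python:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Both raters must score the same subjects")
    n = len(rater_a)
    # Observed agreement: fraction of subjects both raters scored identically.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: probability both raters independently pick each category.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Two raters scoring ten learners on a hypothetical 3-level rubric.
a = ["novice", "competent", "competent", "exemplary", "novice",
     "competent", "exemplary", "novice", "competent", "competent"]
b = ["novice", "competent", "novice", "exemplary", "novice",
     "competent", "exemplary", "competent", "competent", "competent"]
print(round(cohens_kappa(a, b), 2))  # → 0.68
```

    Values near 1 indicate strong agreement beyond chance; values near 0 indicate agreement no better than chance. The appropriate reliability statistic depends on the design (e.g., more than two raters or ordinal scales call for other measures).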

    Seventh Step:
    What outcomes are expected from the results?

    • If the results are not attained, are there any contingency plans in place? Are there plans to evaluate the reasons why the goals were not attained?
    • Can the reasons for success/failure be predicted?
    • Does the program fulfill ACGME competency assessments?

    What important conclusions are expected?

    • What are the positive aspects of this research?
    • What are the negative aspects of this research?


    Helpful resources:

    Although this list is long, not all of these resources will apply to your specific project. If you need help with the development of an idea, these are useful resources that can help with initial steps or tying up loose ends.

    Rubrics – understanding and a guide to developing them

    Rubrics are a helpful means of assessing a learner: they establish guidelines by which the learner is assessed against previously set standards and their own progress. This avoids directly comparing one learner against another while still maintaining the milestones required for progression.

    Previously developed rubrics can be used; however, sometimes a project/program requires the development of a more specific rubric. Here are some steps that might make the task easier:

    • Identify the concept or goal you are assessing.
    • Identify the important dimensions or characteristics of that concept.
    • Provide a stratification of what you expect:
      • Best work provides a description of the highest level of performance.
      • Worst acceptable outcome describes the minimal or basic level of performance.
      • Provide a description of an unacceptable outcome that describes the lowest category of performance.
    • Develop descriptions of intermediate‐level products and assign them to intermediate categories. This might resemble a Likert scale that runs from 1 to 5 (unacceptable, marginal, acceptable, good, outstanding), 1 to 3 (novice, competent, exemplary), or any other set that is meaningful.
    • For feedback, try out the rubric on colleagues to confirm it adequately assesses your concept/goal and is achievable for the intended audience. Then revise accordingly.
    • Scoring Rubrics: What, When and How, an article from Practical Assessment, Research and Evaluation (PARE), http://pareonline.net/getvn.asp?v=7&n=3  
    • NC State University, University Planning & Analysis, includes extensive list of resources for assessment of specific skills or content, http://www2.acs.ncsu.edu/UPA/assmt/resource.htm  
    • Carnegie Mellon University: Enhancing Education Through Assessment, http://www.cmu.edu/teaching/assessment/index.html
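    The stratification steps above can be sketched as a simple data structure. Here is a minimal illustration in Python; the dimensions, levels, and descriptions are hypothetical, using the 1-to-3 (novice, competent, exemplary) scale mentioned above:

```python
# Hypothetical two-dimension rubric; each dimension stratifies performance
# from the minimally acceptable level (1) to the best work (3).
rubric = {
    "image interpretation": {
        1: "Novice: identifies normal anatomy only",
        2: "Competent: identifies common abnormal findings",
        3: "Exemplary: integrates findings into a prioritized differential",
    },
    "report clarity": {
        1: "Novice: report is disorganized or incomplete",
        2: "Competent: report is organized and complete",
        3: "Exemplary: report is concise, complete, and actionable",
    },
}

def score_learner(scores):
    """Map a learner's per-dimension levels to the rubric descriptions."""
    return {dim: rubric[dim][level] for dim, level in scores.items()}

result = score_learner({"image interpretation": 2, "report clarity": 3})
for dim, desc in result.items():
    print(f"{dim}: {desc}")
```

    Writing the rubric down in this form makes the trial run on colleagues concrete: each rater applies the same fixed descriptions, which supports the reproducible, objective measures called for above.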

    Curriculum development

    • Dent, J., & Harden, R. M. (2013). A Practical Guide for Medical Teachers. Churchill Livingstone. Section 3: Educational Strategies  
    • Diamond, R. M. (2011). Designing and Assessing Courses and Curricula. John Wiley & Sons. Chapters 1 (A learning centered approach to course and curriculum design), 2 (expanding role of faculty in accreditation and accountability), 4 (scholarship and faculty rewards), 5 (Introduction to the model and its benefits), 9 (linking goals, courses and curricula), 10 (gathering and analyzing essential data), 20 (meeting the needs of adult learners)  
    • Kern, D. E., Thomas, P. A., & Hughes, M. T. (2009). Curriculum Development for Medical Education. Johns Hopkins University Press.
    • Southgate, L., Hays, R. B., Norcini, J., Mulholland, H., Ayers, B., Woolliscroft, J., et al. (2001). Setting performance standards for medical practice: a theoretical framework. Medical Education, 35(5), 474–481. doi:10.1046/j.1365-2923.2001.00897.x


    Assessment

    • Dent, J., & Harden, R. M. (2013). A Practical Guide for Medical Teachers. Churchill Livingstone. Section 6: Assessment
    • Suskie, L. (2010). Assessing Student Learning. John Wiley & Sons. [nice resource for all around general information]
    • Walvoord, B. E. (2010). Assessment clear and simple: A practical guide for institutions, departments, and general education.[nice basic resource]

    Validity and Reliability

    • Moskal, B. M., & Leydens, J. A. (2000). Scoring rubric development: Validity and reliability. Practical Assessment, Research & Evaluation, 7(10), 1–11.
    • Shavelson, R. J., & Huang, L. (2003). Responding Responsibly. Change (Abstracts), 35(1), 10–19.

    Clinical Education and Assessment           

    • Farmer, E. A., & Page, G. (2005). A practical guide to assessing clinical decision-making skills using the key features approach. Medical Education, 39(12), 1188–1194. doi:10.1111/j.1365-2929.2005.02339.x
    • Friedman, A. J., Cosby, R., Boyko, S., Hatton-Bauer, J., & Turnbull, G. (2010). Effective Teaching Strategies and Methods of Delivery for Patient Education: A Systematic Review and Practice Guideline Recommendations. Journal of Cancer Education, 26(1), 12–21. doi:10.1007/s13187-010-0183-x


    If the project focuses on ACGME-based competencies (the details of which the ACGME defers to each program), review the resources at the ACGME website. Each residency program director should have access to the department's specific competency requirements. It is helpful to contact the residency program director to ensure the project aligns with the residency's goals.

    • http://www.acgme.org/acgmeweb/  
    • http://www.acgme.org/acgmeweb/Portals/0/PDFs/commonguide/VA1_Evaluation_ResidentFormativeEval_Documentation.pdf (search under ACGME site for ‘competency assessment methods’)
    • Gunderman, R. B. (2009). Competency-based Training: Conformity and the Pursuit of Educational Excellence. Radiology, 252(2), 324–326. doi:10.1148/radiol.2522082183
    • Leung, W.-C. (2002). Competency based medical training: review. BMJ (Clinical research ed.), 325(7366), 693–696.
    • Morag, E., Lieberman, G., Volkan, K., Shaffer, K., Novelline, R., & Lang, E. V. (2001). Clinical competence assessment in radiology: introduction of an objective structured clinical examination in the medical school curriculum. Academic radiology, 8(1), 74–81. doi:10.1016/S1076-6332(03)80746-8      
    • Newble, D. (2004). Techniques for measuring clinical competence: objective structured clinical examinations. Medical Education, 38(2), 199–203. doi:10.1046/j.1365-2923.2004.01755.x
    • Rothwell, W. J., & Graber, J. M. (2010). Competency-Based Training Basics. American Society for Training and Development.
    • Smee, S. (2003). ABC of learning and teaching in medicine: skill based assessment. BMJ: British Medical Journal, 326(7391), 703.
    • Swanson, D. B., Norman, G. R., & Linn, R. L. (1995). Performance-Based Assessment: Lessons From the Health Professions. Educational researcher, 24(5), 5–11. doi:10.3102/0013189X024005005
    • Williamson, K. B., Steele, J. L., Gunderman, R. B., Wilkin, T. D., Tarver, R. D., Jackson, V. P., & Kreipke, D. L. (2002). Assessing Radiology Resident Reporting Skills. Radiology, 225(3), 719–722. doi:10.1148/radiol.2253011335

    Accreditation in higher education

    • Wergin, J. (2005a). Higher Education Waking Up to the Importance of Accreditation. Change (Abstracts), 35–41.
    • Wergin, J. (2005b). Taking responsibility for student learning: the role of accreditation. Change (Abstracts), 37(1), 30–33. doi:10.3200/CHNG.37.1.30-33
    • If working with resident education programs, it is helpful to refer to the ACGME website for any new developments. http://www.acgme.org/acgmeweb/

    Grant Writing



    Sample RSNA Education Scholar Grant Applications
    These applications scored well at study section and were considered to be of high quality overall; however, do not assume each section of every application is exemplary.


    Additional resources for those wanting more depth:

    Learning theory

    • Bransford, J., Vye, N., Stevens, R., Kuhl, P., Schwartz, D., Bell, P., et al. (2005). Learning theories and education: Toward a decade of synergy. Handbook of Educational Psychology (2nd Edition).
    • Brookfield, S. (1995). Adult learning: An overview. International encyclopedia of education, 1–16.
    • Pratt, D. D. (2006). Three stages of teacher competence: A developmental perspective. New Directions For Adult & Continuing Education, 1989(43), 77–87. doi:10.1002/ace.36719894309


    Rubrics

    • Stevens, D. D., & Levi, A. J. (2005). Introduction to Rubrics. Stylus Publishing, LLC.

    Curriculum Development

    • Heirich, M. (1980). The people we teach: aids to course planning. Teaching Sociology, 281–302.   

    Educational Research

    • Burkhardt, H., & Schoenfeld, A. H. (2003). Improving educational research: Toward a more useful, more influential, and better-funded enterprise. Educational researcher, 32(9), 3–14.        
    • Committee on Research in Education, National Research Council. (2004). Advancing Scientific Research in Education. National Academies Press.
    • van den Akker, J., Gravemeijer, K., McKenney, S., & Nieveen, N. (2006). Introducing educational design research. Routledge New York, NY.   


    Assessment

    • Ewell, P. T. (2005). Can Assessment Serve Accountability? It Depends on the Question. In J. Wergin (Ed.), Achieving Accountability in Higher Education: balancing public, academic, and market demands (pp. 104–124). San Francisco: Jossey-Bass.
    • Palomba, C. A., & Banta, T. W. (1999a). The essentials of successful assessment. In Assessment Essentials (p. 405). Jossey-Bass.
    • Palomba, C. A., & Banta, T. W. (1999b). Selecting methods and approaches. In Assessment essentials (pp. 85–113). Jossey-Bass.
    • Pike, G. R. (2002). Measurement issues in outcomes assessment. Building a scholarship of assessment, 131–164.
    • Schuh, J. H. (2009). Assessment methods for student affairs. Jossey-Bass Inc Pub. Chapter 3: Planning for and implementing data collection; Chapter 4: Selecting, sampling and soliciting subjects
    • Walvoord, B. E. (2003). Assessment in accelerated learning programs: A practical guide. New Directions For Adult & Continuing Education, 2003(97), 39–50. doi:10.1002/ace.87
    • Wang, X., & Hurley, S. (2012). Assessment as a Scholarly Activity?: Faculty Perceptions of and Willingness to Engage in Student Learning Assessment.
