NORDP 2015 Conference Report: Strategies to Support Multi-Institutional, Cross-Conference Research Collaborations

by Marilyn Korhonen, Ed.D.
Associate Director, Center for Research Program Development and Enrichment
Office of the Vice President for Research, University of Oklahoma

Panelists: Martha Cooper and Nathan Meier

The Traumatic Brain Injury Project is a great example of being ready to seize an opportunity, and it serves as a lesson to watch for such opportunities and be flexible enough to respond. One important aspect of the project is the timeliness and critical need to address concerns about the impact of more than 3 million brain injuries that occur each year, many of which are associated with college athletics programs.

A second critical factor is the presence of an established organizing structure. In particular, the Big Ten Athletic Conference and the Ivy League came together to improve traumatic brain injury prevention, detection, and treatment strategies. While this project aligns with athletic conferences, it is enabled by the Committee on Institutional Cooperation (CIC), which is a consortium of the Big Ten member universities plus the University of Chicago. The CIC has been in place for more than 50 years, enabling the member institutions to advance their academic missions by sharing expertise, leveraging campus resources, and collaborating on innovative programs.

A third factor is the ability to govern, fund, and staff the project quickly and equitably. CIC is governed and funded by the Provosts of the member universities, and coordinated by a staff from its Champaign, Illinois headquarters. Thus the project had a natural, established home. This governing body allowed for focused goals and focused approaches implemented in coordinated ways.

Most of these factors exist primarily outside of research development. So a fourth important factor is to make a case for research and scholarship, and to leverage the resources established for the program. In this case, having a larger sample size of athletes with potential traumatic brain injuries enabled use of evidence-based, clinical protocols. These protocols may lead to collaboration with the Department of Defense, allowing for comparison of TBI based on a greater number of factors.

Finally, the University of Nebraska was in a position to provide leadership as well as physical resources to create a Center for Brain Biology and Behavior, which is attached to their Athletics Performance Lab (all within the football stadium). This strengthens their research program and provides even greater resources to the overall TBI Project. This project has already resulted in 22 research collaboration efforts and 12 distinct sources of funding.

Some of the challenges include:

  1. Increased competition for limited federal funds
  2. Balancing tensions between collaboration and competition
  3. Lack of equity in institutional contributions of seed funding and other support toward the project

Ultimately, the presenters expect that a strong focus on common goals will be the key to the success of their project.

NORDP 2015 Conference Report: Building an NIH Portfolio Without a Local Medical School

By Karen Markin, PhD, Director of Research Development, University of Rhode Island
To build a portfolio of grants from the National Institutes of Health at an institution without a medical school, it is essential to understand the agency’s mission, according to Janet E. Nelson, associate vice chancellor for research development at the University of Tennessee. That mission is to seek knowledge that enhances health, lengthens life and reduces illness and disability.

Nelson was one of three panelists who discussed strategic planning for successful grant-seeking from NIH in an increasingly competitive environment. The panel was part of NORDP’s annual Research Development Conference in Bethesda, MD.

Award rates at NIH are falling, noted Jennifer L. Webster, manager of strategic research initiatives at the University of Tennessee. However, it is still making grants focused on certain initiatives, including precision medicine, antibiotic resistance, cancer, brain research, Alzheimer’s disease and new vaccines.

Institutions without medical schools can compete by focusing on their strengths relative to other institutions. Panelists urged participants to think about the unique strengths of their institutions. For example, panelist Meredith Murr said the University of California at Santa Barbara has a top engineering department with talents it can leverage into NIH awards. The institution also hires strategically, focusing on medical researchers, and builds collaborations outside the university.

Other tips from the panelists:

  • Invite an NIH program officer to speak at your campus.
  • Organize quarterly networking events and involve off-campus groups.
  • Conduct red-team reviews on grant proposals.
  • Offer proposal development workshops.

NORDP 2015 Conference Report: Preparing Competitive STEM Education Development Proposals: Planning for Sustained Adoption

By Vanity Campbell
Proposal development and project management of large STEM education proposals often lack design elements to ensure sustained adoption of successful programs. The presenters shared six key best practices to increase the impact and systemic change anticipated from such proposals. In the past, funding agencies have encouraged a solitary, cyclic model for STEM education improvement based on research, evaluation, and implementation of innovative methods. This system relied on isolated, individualized development of new methods by researchers with limited outside feedback and involvement. Dissemination of program outcomes was often passive, relying on conferences, websites, and publications to share program results. As a result, proposed solutions were unique to specific institutions and lacked transferability and scalability.

New federal agency trends are emerging in STEM education development to address limited adoption and portability and to broaden dissemination. To identify best practices, the presenters analyzed 75 NSF CCLI grant proposals funded in 2009, case studies of well-propagated innovations (PhET, PLTL, Peer Instruction), and recent related literature. The results showed that effective propagation requires six key elements:

  • identification of potential adopters,
  • an extensive plan for attracting, training, and supporting adopters,
  • attention to propagation early, while the program is ongoing,
  • identification of the relevant elements of the instructional system,
  • a clearly articulated plan, with rationale and strategy defined,
  • alignment among the innovation, potential adopters, and selected strategies.

As a hands-on activity, the presenters led workshop participants through an evaluation of a proposed STEM education proposal using an assessment instrument focused on project type, target curricula, propagation activities, and plans.  Participants reviewed and assessed two different proposal project summaries, then compared evaluation ratings and comments.  From this exercise, participants learned that successful propagation targets an intended audience, engages users, initiates propagation plans at the outset, treats the plan as an instructional system, and defines a clear and thorough strategy.

Successful propagators identify potential adopters, interact with them, and support them. To achieve this, proposal planning requires interactive development, interactive dissemination, and support at three levels: individual, department, and institution.  Interactive development includes partner institutions, advisory boards, and beta testing; by comparison, isolated development involves primarily internal institutional stakeholders.  An interactive dissemination plan consists of immersive workshops, leverages professional societies and pilot sites, and fosters scholarship in other faculty, whereas static dissemination, such as sharing results only via articles and websites, should be avoided.  Lastly, adequate support assists adopters through networks, customizable materials, and consultation.  This ensures successful adoption, in contrast to adopters implementing new methods in isolation with no support for addressing challenges.

Implementing strong propagation plans can strengthen STEM education proposals and ensure sustained adoption and successful impact of active programs.

For contacts and additional information, see

NORDP 2015 Conference Report: Innovations in Research

By Lucy Deckard
Presenters: Margaret Hilton (National Research Council), James Gentile (Hope College), and Kara Hall (National Cancer Institute; member of the National Research Council committee)

Margaret Hilton gave an overview: this session discussed a series of reports related to innovations in scientific research (specifically, interdisciplinary and team science), the latest of which came out in late April 2015.

  • “Facilitating Interdisciplinary Research” National Research Council (2005)
  • “Convergence: Facilitating Transdisciplinary Integration of Life Sciences, Physical Sciences, Engineering and Beyond” National Research Council (2014)
  • “Enhancing the Effectiveness of Team Science” National Research Council (2015)

The first report defined “interdisciplinary” vs. “transdisciplinary” (research that transcends disciplinary boundaries). In theory, an individual can conduct these kinds of research alone, but in reality that rarely happens. This is where you get to the realm of Team Science – science conducted interdependently by more than one person.

These reports came up with some common recommendations for changes needed to promote and accommodate these new ways of doing scientific research:

  • Revise promotion and tenure policies
  • Expand funding mechanisms and review criteria
  • Conduct research/evaluation to understand and guide improved interdisciplinarity and convergence in science

James Gentile spoke about “convergence”:

Scientific research is becoming more problem-centered. Mother Nature is winning, and she has no departmental structural constraints. In order to solve complex questions in science we need true innovation and interdisciplinary collaboration. In addition, tools in science are exploding, bringing disciplines together.

The grand challenges that we want to converge about include:

  • Green energy
  • Chemistry and physics of living systems
  • Synthetic capacity of life
  • -omics to uncover new approaches to disease
  • Others were also listed.

The Research Corporation, the Howard Hughes Medical Institute, and others in a science coalition came together and made a context map. Addressing these problems means we have to go through a web that includes law, policy, and economics, as well as virology.

So in the future we will have to learn how to converge. For example, brain mapping requires many different types of expertise. Interdisciplinary research is usually altruistic: a researcher takes a sample to a colleague in chemistry and asks if she’ll run it on her machine, and she does this as a favor. In contrast, when we converge, my question gets answered, but that answer presents a new question for the colleague in chemistry. In this case, everyone is growing. They coined a new term, “Scialog,” from science and dialog.

The Research Corporation for Science Advancement brought together researchers to consider a national priority: energy from photosynthesis (in essence, can we build an artificial tree based on nanotechnology?). They invited researchers to form teams, but before the teams pitched the science, the Research Corporation just wanted to hear the justification for the team composition. They ended up funding a group that created bio-inspired silicon photovoltaics. Convergence also works in education; he gave the example of having students design robotic “cockroaches”. See NRC, 2014.

Kara Hall talked in more detail about the recently released report, “Enhancing the Effectiveness of Team Science”:

The committee looked at factors that impacted effectiveness of science teams:

  • Individual factors
  • Factors at team/center/institute level (organizational factors)
  • Management approaches and leadership styles
  • How tenure and promotion are affected
  • etc.

The team included people with a broad range of backgrounds, including psychologists, biologists, social scientists, etc. They used several measures to evaluate effectiveness, including which research is cited more and which yielded more patents. They found that research done in teams is cited more, yields more patents, and demonstrates high levels of innovation.

They defined the following terms:

Team science – collaborative, interdependent research conducted by more than one individual

Science team – 2–10 individuals

Larger group – more than 10 (teams of teams)

Team effectiveness – a team’s capacity to perform

Key features that cause more challenges for team science are large membership diversity, the need to effect deep knowledge integration, (sometimes) large size, goal misalignment, permeable team boundaries (meaning members may move in and out as the research evolves), geographic dispersion, and high task interdependence. They concluded that there is already a strong body of research on team processes as they relate to effectiveness, but most of that research was done on teams such as business teams (not science research teams), so we need to bring that literature into the context of science.

They identified three main areas where interventions could enhance effectiveness: team composition, team professional development, and team leadership. Kara summarized several recommendations based on current research in each of these areas.

Composing your team: Consider using task analytic methods to identify needed knowledge, skills and attitudes. These methods can be used to match task-related diversity among team or group members.  Also, consider moving outside your usual network – for example by leveraging networking tools.

Team Professional Development: Team professional development models prevalent in business could be applied to science. The committee recommended that we look at these models to see what’s out there and develop them so that they are relevant to science teams. When dealing with diverse teams and trying to develop shared knowledge, it’s very important to devote time to developing a shared vocabulary. This may seem like it takes a lot of time, but in the end it will be worth it.

Leadership: The committee noted that there is already fifty years of research on teams and organizational leadership, so we should take advantage of this robust foundation and adapt it for leaders of science teams and larger groups.

The team also recommended that, in order to address the challenges of geographic dispersion, we conduct research on virtual collaboration and geographically dispersed science teams. They also recommended that dispersed teams consider task assignments within semi-independent units at each location to reduce the burden of constant electronic communication.

The team also concluded that while universities have launched new efforts to promote interdisciplinary team science (e.g., getting rid of departments), the impact of these initiatives on the amount and quality of team science has not been systematically evaluated. The main hurdles may not actually be disciplinary structures but may instead be factors such as promotion and tenure (P/T) criteria.

This points out the importance of aligning reward structures with encouraging team science. Many university P/T review policies are uncreative and don’t give credit for team-based research. One model might be big physics, where research has long been done in very large teams and researchers can get credit for pre-publication products (e.g., software, databases), not just first-authored publications.

Funding agencies also need to play a role in encouraging a culture change in the scientific community. The report recommends that funders encourage development and implementation of new collaborative models (e.g., research networks, consortia). They also need to support the development of resources that support team science (e.g., info repositories, training modules, ensuring data is available for mining). We also need more targeted research about team science, but few funding programs support this research.

She also recommended that folks attend the SciTS (science of team science) 2015 conference, June 2–5, 2015, in Bethesda, MD.

Questions and Answers:

Question: Are more diverse teams more difficult to manage?

  • Faculty are a non-pack-oriented group – leadership and administrative oversight can be difficult
  • Teams should start out small – then they can grow as they demonstrate success
  • In the team science report there was lots of discussion around diversity; we will see evolving culture shifts and more emphasis in education. A new Chief of Science Workforce Diversity at NIH was just named. As we get used to more diversity, it will get easier.
  • The National Academy will hold a meeting targeting provosts and deans, highlighting their role in addressing some of these team science issues (e.g., authorship, where different disciplines have different authorship criteria).

Question:  Are there recommended strategies for forming teams?

  • When you have a team that has worked together and then bring in some new people, that often works best. If you’ve been working together too long, you can lose your innovative edge. Sometimes concatenating two teams also works.
  • When there’s too much competition among teams, it degrades the teams’ ability to work with one another, which may be needed in the future. One way to address this is to develop large networks and initiatives that include multiple centers to foster collaboration.
  • NIH is offering the opportunity to bring together scientists to think about a problem space without a commitment. That way, teams can form, and when there is an application, it’s more sophisticated.
  • Another strategy is to form a team around teaching – if they can work together around teaching, a lot of research can come out of that, and it’s a way for the members to get used to working with one another (for example, developing innovative interdisciplinary non-major courses)

Question: Can core facilities help bring teams together?

  • You have to find a way to stimulate conversation. Some core facilities work better than others at stimulating interdisciplinary collaborations. The core needs to understand that this is one of its roles.
  • Sometimes the core is competing with the people they are supposed to be supporting – they need to get rewarded for bringing people together.

The presenters thanked their sponsors: NSF and Elsevier.

NORDP 2015 Conference Report: When Research Development is Just One Part of Your Job Description

By Sarah Pollock-Wisdom, Washington State University

Presenters: Michael Spires (University of Colorado, Boulder) & Kellie Dyslin (Northern Illinois University)

Background of the presenters: Both presenters have experience with this topic. Kellie Dyslin has been in research development for fifteen years. Her work in RD began with a nonprofit where she worked to launch projects and find grant opportunities, all without realizing that she was actually participating in research development. Now she works at NIU, where she helps investigators put their best foot forward. Michael Spires has been in research development for ten years. He spent some time working with the Smithsonian and now works with humanities and social science faculty at UC Boulder.

This session covered four areas:

– Objectives

– Definitions

– Challenges & Opportunities

– Best Practices

The objectives for the session were that participants would 1) gain an understanding of the challenges and opportunities inherent in positions that blend research and proposal development; 2) understand how blended positions differ from separate, full-time research development roles; and 3) gain tools for enhancing one’s ability to conduct research and proposal development as only one responsibility among many.

The presenters started by defining some of their terms (with help from the audience):

– Research Development – the bucket of things that happens before we put a proposal together (relationship building; identifying community partners; internal seed grants; positioning; helping faculty talk to program officers, etc.)

– Proposal Development – This narrows down to specific projects. Often this is where the work starts for most of us (formatting of bios, C&Ps, COIs, etc.). Although we might want to help with Research Development (also referred to as the “pre-pre proposal part”), it is typically mentors in the PI’s field who assist in this area. Once you have worked with a PI once or twice, though, they are more likely to come to you for research development assistance.

– Research Administrator – This person makes sure everything is compliant. They monitor internal & external deadlines. They may help with budgets. Assisting with budgets is a good inroad to getting more involved with research development activities. Most PIs are delighted to hand the budget off to someone else, and once they see your ability to help them, they may be open to more input from you in other areas.

The presenters asked the audience to consider a question: Is our time on the nitty-gritty details (formatting biosketches, checking margins, adding page numbers) really valuable? Although such mundane activities take time away from the critical area of research development, the presenters argued that the time spent on details is valuable, if only because it keeps proposals from being returned without review for noncompliance. While it may feel like we format biosketches endlessly, typically we are not working with the same PIs every time; hopefully the PIs learn from the help we give them and grow out of needing our assistance in that area. To alleviate the time spent on these mundane details, one audience member suggested hiring undergraduates to format the bios. Another audience member shared how she keeps a spreadsheet of all she does so that she can show her supervisor and justify her requests for student hires.

Next the presenters discussed the challenges and opportunities of a job that involves more than simply research development. These challenges and opportunities sometimes go hand in hand.

Notes from the 2014 NORDP Conference: Strategies for Increasing the Competitiveness of Team Science and Center Grants

Presenters: Christine Black (University of Michigan) and Jeff Horon (Elsevier)

Center grants are a GIANT undertaking, and institutions may or may not have a department devoted to supporting these efforts. Below are a few low-budget tools that research development staff can make available to faculty to support their efforts with Center grants.

  • Having someone write the administrative cores, and/or providing stock language about the cores that can be re-purposed
  • Providing a reference list of freelance editors who can be hired
  • Providing a library of successful proposals that faculty can review and learn from
  • Matchmaking/speed-dating meetings that allow people within the research community to learn about others and what research they are doing

–Notes by Anita Mills

Thank you, Anita!

NORDP 2014 Conference Notes: Presidents’ Chat

Panelists:  Alicia Knoedler, NORDP President; David Stone, NORDP President Elect; Ann McGuigan, NORDP Immediate Past President

This conference session was guided by audience questions and comments, resulting in a lively and wide-ranging session.

Regional NORDP groups: 

An audience member asked about regional NORDP groups and the presenters noted the NORDP NE group. All agreed that regional groups can be useful but volunteers are needed to run the groups and meetings can be difficult even within a compact region. The presenters suggested looking at NCURA’s regional structure. Ann noted that, as someone who has recently moved, she can see the value in regional groups.

EPPD coordinates the NORDP mentoring program, which may consider regional ties as requested by participants.  It might also be useful to provide a model for how new members can use NORDP membership lists, i.e., people new to NORDP review the membership list and then contact people.

The recent email traffic about bylaws and membership

Alicia noted that adoption of Bylaws does not require membership feedback. Nevertheless, the Board is developing a mechanism to allow members’ comments on policies and procedures. On the question of the difference between membership for individual consultants versus individuals working for larger consulting firms, the Board has adopted the ACA definition as the cut-off between small and large firms: for-profit organizations with 25 or more employees will be affiliates, while firms with fewer than 25 employees will be regular members. On the question of other models, such as one from NCURA, David and Ann said the Board was looking for a national standard and a more inclusive definition.

Professional development

Following from the general session on research development and future research leaders, discussion moved to the topic of professional development in RD.  For example, Notre Dame has a ‘Professional Specialists’ category for individuals not on the tenure track, while an audience member noted that at UW-Milwaukee, indefinite status (a kind of academic staff equivalent of tenure) may be attached to some positions. With regard to professional development for non-faculty administrative positions, Alicia noted that the institutional culture matters. Ann noted that RD professionals interact with many other groups and people, e.g., federal relations, communications, etc., which may provide other paths to professional development and career trajectories. There are also research development career opportunities outside of the academy; rather than working only in the ivory tower, it’s possible to engage with agencies that engage with the ivory tower. Panel members suggested that, as a profession, we think broadly about RD and how to engage, promote, and facilitate it.

They also discussed the importance of educating the broader research community about RD; members should work to write for publication, participate in conferences, and utilize open source venues, etc., to establish NORDP’s role.  NORDP could provide the structure for open-source publication and outreach. One example was given of a former VPR who is now VP for Economic Development; this change points to development of a new pillar at universities to engage industry, thus helping to connect research and economic activities. Industries often lack understanding of federal agencies and universities.

The conversation moved to evaluation of RD offices. NORDP recently did an evaluation of the UC Merced RD office; as a new institution, Merced wanted a review of its program. A staff member was hired in 2008 and the review happened in 2010. Two NORDP members conducted the evaluation, meeting with the VPR, deans, heads of research institutes, faculty, and the president. NCURA also did a review of the sponsored projects office.  Both evaluation reports were made public. Although the reviews were different, they were complementary, outlining a clear path for growth of the offices. In the case of the RD office, staff has been increased from two to six.  In addition, based on the UC system analysis of research activity, UC Merced showed a 57.5% increase from the last fiscal year, which the director attributed in part to the peer review process. She concluded that having a peer review of the campus RD office is valuable to the overall professional development of the office and the campus.

Another topic focused on what we can do to increase industry engagement. One suggestion was to work more closely with CFR and Technology Transfer staff. CFR staff do not necessarily discuss how to engage with industry, and Technology Transfer may not think as broadly about RD; each also has a different culture.  It can be very informative to work with them.

Other suggestions for improving professional development processes included:

  • Adding elements to the membership survey to gather information about individuals’ past experience and expertise in the non-profit, for-profit, and government sectors, and exploring ways to leverage the vast range of experience we know exists in our membership.
  • Developing career tracks/paths to help members envision ways to move through their careers.
  • Developing some kind of mechanism for taking stock of the expertise we have and articulating what we can do.

Overall, the point is that the job should not be boxed in and restricted.  Part of developing as a profession is recognizing that explaining what RD is not is as important as explaining what it is. Looking at how other professions have developed over time, for example HR over the last 25+ years, may help us imagine the development of both the field and individuals.

A related question was then raised: How do we recruit people to the RD field? One audience member mentioned her experience as a post-doc in RD; panelists suggested she write an article about that experience. Some universities do have similar post-docs; it was suggested that we try to identify where these exist, and how they are structured.  This could then be presented as part of next year’s conference, as one model for RD career development.  It was also suggested that stronger career and professional development sessions could be incorporated into the annual conferences.

Another focus was on refinement of mentoring in RD.  In addition to the traditional mentoring model, suggestions included more cross-sector mentoring, such as between academia and government.  The mentoring program also needs more volunteers to fulfill the requests made each year, and the idea of mentoring may need to be broadened from the model of expert and novice to one of interaction among equals who bring new perspectives and “fresh eyes” to issues and situations. It may make sense to rename the mentoring program to reinforce this idea; the term may actually limit membership participation, particularly if members think they need extensive formal RD work experience.

Notes by Kari E Whittenberger-Keith.

Thank you, Kari!

NORDP 2014 Conference Notes: Small Investments, Big Impact

This session focused on Boise State University as a case study of an institution striving to move from a teaching-intensive mission to a research mission. Boise State started as a junior college in 1932 and became a university in 1974, with 3 master’s programs. In 2003, it hired a new president, Bob Kustra, who in 2005 implemented a vision to become a “Metropolitan Research University of Distinction.”  At that time, the university had 65 master’s programs and 2 PhD programs.  In 2012, it developed a Strategic Plan to gain distinction as a research university and now has 10 PhD programs.

Initially, the central Division of Research had a very high-level view that faculty should simply be submitting more proposals. The message was, “C’mon, faculty!” and, not surprisingly, this did not go over well with faculty. They then decided to look at institutional barriers to faculty engagement in securing external funding for their research. One helpful document is the National Research Council’s Partnerships for Emerging Research Institutions: Report of a Workshop, published in 2009. It identified several barriers for universities trying to make a similar transition: 1) insufficient reward for faculty who pursued research; 2) a teaching load so high that it is hard for faculty to find time to do research and pursue funding; and 3) limited administrative support for research and for pursuing research funding within these institutions. In order to become a successful research institution, Boise State decided to address these barriers.

Faculty in the College of Health Sciences at Boise State came from a rich history as master educators, not researchers. The absence of pilot work and publication history among faculty left them in a less competitive position for external funding proposals, and they needed a safe space to ask for help. In addition, new faculty hires were getting conflicting messages: department heads were focused on meeting the department’s teaching needs, while higher-level administrators had research expectations for these faculty. Furthermore, without help, faculty who did submit grant proposals were generally unsuccessful, so they became discouraged.

Two tandem strategic initiatives were instituted at Boise State University to address the identified barriers. The College of Health Sciences established an embedded research development office, led by Terri Soelberg, while a research development initiative, led by Kimberly Page, was added to the central Office of Sponsored Programs. For either office working alone, changing the culture would have been too heavy a lift, but they found that they could do much more together than separately, so they started working together. The advantage of this strategy was that Terri, working at the college level, understood the faculty member’s perspective and could get to know faculty individually, while Kim, working at the institutional level, knew about higher-level initiatives before they were rolled out and could promote higher-level connections and strategic partnerships, for example, through senior administrators to the national labs and state agencies. This allowed strategic positioning at the institutional level while also helping to develop and assist faculty so that they could be competitive for funding.

Terri and Kim presented a matrix of activities they pursued to support organizational and cultural change across three levels: 1) personal, 2) peer-to-peer, and 3) organizational/infrastructure. The matrix had two columns according to which office took the main responsibility for spearheading each activity: the college-level research office or the Office of Sponsored Programs.  Examples are research trajectory planning (personal/college-level research office), networking events (peer-to-peer/OSP), and implementing faculty incentives to reward research (organizational/OSP).  (Please see their slides for the full matrix.) They then discussed which of these activities worked well, which didn’t, and lessons learned.

Example activities that worked well:

  • Implemented faculty incentive pay program: Working with Department Heads, faculty who won grants could recover some of the salary savings generated from buying out their time to receive incentive payments.
  • Faculty mentoring: Seasoned faculty meet with junior faculty to discuss specific topics, for example, how to work with DOE.
  • Strategic Research Development Initiative: Institutional research development program, but not a seed grant program (which hadn’t really yielded external funding). This provided small levels of targeted investments to fund specific needs identified as bottlenecks or capacity limiters; for example the need for a particular piece of equipment to get preliminary data. Follow-up is important.
  • Individual assistance with research design: This helped faculty get to the point where they could be competitive.
  • Built up post-award infrastructure: This was a huge pain point. Post-award administration used to be part of Finance and Administration, which resulted in disconnects between pre- and post-award management of sponsored programs. Faculty who did win awards became frustrated with administrative inefficiencies, which then became a disincentive for pursuing additional funding. To improve the post-award infrastructure, the Division of Research absorbed the post-award administration function into the Office of Sponsored Programs. This restructuring ensured consistency and continuity of service and put OSP in a stronger position to advocate institutionally for support of faculty research administration needs.

Some other things didn’t work well, and an important lesson learned is to evaluate these activities early and not be afraid to pull the plug if they aren’t working. Some examples they mentioned:

  • Faculty writing group: This was implemented at the request of the faculty, but after the first few meetings attrition became a problem. Terri quickly decided that there was not enough return on her investment of time, so she stopped this activity.
  • Internal peer review process: It turned out that faculty reviewers were either too gentle in their reviews because they knew the PI well and perhaps feared being identified, or they were too harsh, which wasn’t helpful to the PI.
  • Newsletter: Wasn’t being read.
  • Formal networking events: Faculty did not respond to these even though they were good about gathering at a local watering hole for social interaction. It probably didn’t help that there was no alcohol.

Lessons Learned and Successful Strategies

  • Don’t try “pushing a rope.” Top-down proposals with no faculty leadership are destined for failure. When upper administrators push these projects, it’s helpful to explain to them that a poorly developed proposal without faculty buy-in will hurt the institution’s reputation among reviewers and program officers. It’s a question of not being ready yet, not just passing up an opportunity.  The institution may be in a better position to compete in the next round. This explanation usually resonates with institutional leadership.
  • Don’t try to be all things to all people. Initially, Kim met with all departments and offered help to all comers. She tried to convince people who weren’t interested to submit. As a result she quickly became overwhelmed. She learned that it is better to focus on strengths in the institution, and this works better at the college and department level where it’s possible to get to know the faculty and their research better.
  • Establish your research development infrastructure early.  Don’t wait for successes and increased research activity to develop this infrastructure. At the college level, even just a 1 or 1.5 FTE commitment can make a big difference.
  • Educate up.  Because the institution is evolving, it is important to help the upper administrators, deans, etc. understand institutional barriers, the need for infrastructure, and research development strategies.
  • Don’t expect immediate results. The College of Health Sciences made a strategic decision to pull back from submitting a lot of proposals until they could be competitive. Terri and her colleague worked with a cohort of faculty over two years to help them get publications, do pilot studies, etc. so that they could be competitive for grants rather than encouraging them to submit when they weren’t in a competitive position yet. As a result, the number of proposals initially went down, but now it’s paying off, and they had $3.5M in proposal submissions in the last quarter.
  • Don’t be a mile wide and an inch deep.  They decided to refocus their research development to “work with the willing” and make small strategic investments to help develop research capacity, and that has paid off.

Impact and Culture Change

  • They now have a larger pool of PIs who are competitive, with a critical mass in areas of research strength
  • More strategic hires based on research strengths, not just teaching
  • Momentum: This is supported by quantitative and qualitative measures of success, but it’s important for the administration to understand that value is beyond just the number of proposals that go out the door.
  • More institutional research capacity and space
  • The number of submitted proposals has doubled in the last 8 years
  • Research expenditures increased 87% from FY 2007 to FY 2013; the university now has 10 PhD programs

Takeaways for Others In a Similar Situation

  • Survey the landscape, and identify barriers and obstacles to research and pursuing research funding
  • Find leverage points; for example, develop relationships between college-level research development personnel and research development personnel in central offices such as the Office of Sponsored Programs. Sharing communication, resources, experience, connections, and training helps to align strategic priorities and institutional education.
  • Fail often in order to succeed faster – accelerated cycle of innovation (don’t be afraid to try things, but also pull the plug if they aren’t working and try something else); support from your administration is critical to allow the freedom to fail and try something else.
  • Conduct a critical appraisal of the ROI for the initiatives you try (example: Seed funding program – it didn’t result in external funding, so they moved to more strategic, targeted funding).
  • Repeat

Example of Cycle of Innovation

Terri and Kim concluded their presentation by walking through an example of how they worked together to vet and fund an internal grant project and how each of them was able to marshal resources (financial and administrative) to assess the project’s potential for future funding, address administrative and legal considerations efficiently, monitor the project, and lay the groundwork for successful deliverables.

Terri Soelberg, Director, College of Health Sciences Office of Research
Kimberly Page, Associate Director, Office of Sponsored Projects


Scribe: Lucy Deckard

Thanks, Lucy!!

NORDP 2014 Conference Notes: Cost Effective Ways of Keeping Up With the Joneses

Bryan DeBusk and Paul Tuttle (Hanover Research)

Make the most of budget dollars; budgets are flat or being reduced, but there is still pressure to increase awards and increase services for faculty/staff.

Overview of Hanover Research and Presenter Backgrounds

Hanover: full-cycle proposal development background. Bryan: a faculty member who transitioned into grant development (no research office experience, but that gives perspective and the ability to learn from the roundtable discussion). Paul: central sponsored projects offices in North Carolina (two Historically Black Colleges/Universities and one women’s college); mostly advancing “pre-pre-award” work (not called research development at that time); experience working with SRA, NCURA, and now NORDP.

Keys to Success

  1. Define Goals in Measurable Terms

Goals could include:

  • # of submissions/awards
  • Average request and award size
  • Percent of Faculty/Staff seeking/receiving grants
  • Types of faculty/staff seeking/receiving grants
  • Overall award amount—increasing this through larger single awards or a greater quantity of smaller grants
  • Award metrics by time frame or institutional unit
  • Expenditures
  • Using Top 25 University Criteria
  • “Success rate”—this was argued to be a poor metric, since the RD office’s role is to provide help; if the rate is not weighted based on service, it may not be representative
  2. Know Available Resources

Once you know the goal(s), you need to know what resources are available:

  • Personnel, including number, experience and skills
  • Infrastructure, policy for supporting development and submission
  • Faculty capabilities
  • Funding for grantsmanship survey
  • Funding for consulting services/external support (able to extend services without having to hire new staff)
  • Libraries, databases/tools (some can be expensive)
  3. Maximize Use of Available Resources

Determine how to make the most of what you have and fill the gap between what you have and what you need:

  • Facilitate the use of support
  • Broaden participation and exposure of the office
  • Identify and use institutional levers
  4. Make Use of Alternative Resources

Find the area that is most ripe for cost-effective use:

  • Identify other staff that can be utilized for proposal development
  • Use available funder resources
  • Leverage partnerships (i.e. library)
  • Online resources—some are free, so you can use strategic payments to supplement available resources.  Taxes pay for many resources, so they are not a direct cost to your office.
  • Professional Associations (NORDP, NCURA, SRA, etc.) have resources available to members
  • Colleagues and their offices’ resources—maximize exposure; point toward their information (don’t just co-opt it)
  5. Other Suggestions

These range from least to most costly and from easiest to most difficult to use:

  • Recruit funded senior faculty as mentors
    • One to one mentors
    • Mock reviews
    • RD advisory board
    • Workshop development and leadership (training workshops)—they are teachers and cost effective to present to a large group; flattered to be a part of the RD enterprise
    • Guidance on how to serve on panels
    • Share successful proposals
  • Implement/Expand Research Development Support Workshops
    • Grants 101
    • Federal Vs. Foundation Funding
    • Budgeting
    • Finding Partners, Collaborators, and Mentors
    • Grants A-Z (overview all areas of research administration to put a face to the task for faculty)
    • Time Management to Tenure (helps second-year junior faculty become more competitive to apply for grants; strong qualitative feedback—faculty feel more in control)
    • Funding Trajectory Planning (agencies, funding mechanisms, timeline)—U of Michigan estimates that to put together one trajectory/road map takes ~5 person hours
    • Writing Seminars
      • Writing Clearly and Concisely for Grants (includes overviews of common sections such as Significance, Innovation, Specific Aims)
      • Grant Writing Course (1 semester for grads, post docs, early career faculty)
  • Getting Attendance at Workshops
    • Use food or other incentives to boost attendance (don’t always have to pay, just hold session over lunch in a place where food is available for purchase)
    • Identify “difference makers” who can encourage/compel attendance—people who get other people energized (especially senior faculty)
    • Announce broadly, but invite directly (ensure events are in newsletters, emails, and/or other announcements, but then contact faculty/staff individually to personalize invitation)
    • Stipend/payment for attendees (this is successful if tied to a specific outcome—must submit as a result)
    • Make attendees pay a small amount (psychologically motivated since they paid) 
  • Document Repository

Develop a curated electronic/paper document repository (do a cost-benefit analysis first, as this can be time consuming; if faculty/staff will not benefit, spend the effort elsewhere). It must be easily navigated and can be access-protected on an intranet, etc.

  • Successful proposals with reviewers
  • Unsuccessful proposals with reviews (these are more difficult to get, since faculty often believe that sharing them makes them vulnerable); include funded and unfunded proposals to the same initiative to show how they differ
  • Have senior faculty do their own highlighting of proposals before they are included in the repository (to demonstrate the evolution from good to fundable)
  • Document conversations with funders and include in repository (with access at least for other RD staff)
  • RFP analysis
  • Sample budgets and other sample documents/templates (if you do not have these at your institution, direct to colleagues’ or agency pages that do have these)—e.g. NIAID’s All About Grants page
  • Webinars
    • Develop your own webinars (the benefit is the ability to archive for future use and wide dissemination; drawbacks include no one-on-one interaction, and if you have many registrants but not everyone attends, you cannot determine the impact of your efforts)
      • Can be restricted to campus or shared with sister institutions (potentially way to raise funds, if you charge for external access)
      • Invite people “in the know” to lead (program officers, etc.)—use online teaching tools to provide open facilities to create webinars (see the University of Missouri–Columbia’s Federal Funding Webinar)
      • Maintain an up-to-date archive
      • Leverage Institutional IT personnel to assist with creation/facilitation
  • Connect Faculty/Staff to free webinars (general, discipline-specific and advanced knowledge)
  • Develop Online Training Modules
  • Teach key skills or introduce policies, procedures tools
  • Identify if gaps exist, then good use of time to develop training to address these gaps/commons questions and challenges
  • Make available on demand
  • Develop at-a-glance, indexed, and searchable versions of manuals
  • Demonstrate use at every opportunity (workshops, presentations, etc.)
  • Single page describing key points
  • FAQs
  • Ensure that all manuals that are available electronically are searchable (not just a scan of a document)—convert them to searchable PDF
  • If quick and easy to use and find online, people will use it—saves time for them and you
  • Collaborate with Other Offices at Your Institution
  • Essential to proposal development and submission and project management functions
  • Other offices have budgets for their missions, so if can leverage this, save money for your office
  • Coordinate your needs with their missions
  • You receive services because it is their job
  • Marketing (e.g. profiling researchers to show university capabilities, webinar/online training)
  • Other
    • Collaborate with Development and Industry Collaboration Offices (provide start-up funds for young faculty; leverage Development knowledge)

    Scribe: Alicia Reed

Thank you, Alicia!

NORDP 2014 Conference Notes: Funding opportunities in the arts, humanities and social sciences: strategies for supporting and promoting a grant-seeking culture

Presenters: Susan Gomes (Harvard University), Barbara Walker (University of California at Santa Barbara) and Caitlin McDermott-Murphy (Harvard University)

Noting that the grant proposal writing culture is not ubiquitous across academic disciplines, the three speakers delivered a three-pronged presentation: why seeking grant support is important for arts, humanities and social sciences scholars, what the funding landscape looks like for these disciplines, and how to establish a culture of grant proposal writing. Successfully funded scholars benefit both the institution (possibility of securing F&A costs and institutional prestige) and themselves (possibility of summer salary or reassigned time, raising visibility about scholarship and having that scholarship validated through the peer review process, and the opportunity to create or expand a scholarly network).

The presenters discussed major federal funders in the humanities and arts areas, including the National Endowment for the Arts, National Endowment for the Humanities, National Archives and Records Administration (National Historical Publications and Records Commission), and the Institute for Museum and Library Services.  Social sciences researchers can look to funders like the National Science Foundation, the National Institutes of Health, and the Departments of Education, Defense and Justice.  Private funders (foundations), funding at the state level, and Foundation Center tools and reports also were discussed, along with descriptions of resources like H Net Online and a forthcoming book by Barbara Walker on this topic.

Several strategies for promoting a culture of grantsmanship were shared, including programming (workshops led by research development professionals and faculty); sponsor campus visits; developing partnerships with academic deans and other key figures; continuous outreach to faculty; funding opportunity dissemination; and faculty surveys (for the purpose of eliciting feedback while advertising services).  The presenters concluded the session by reminding participants to leverage resources on their campuses in support of arts, humanities and social sciences faculty, noting that “Everything doesn’t have to cost something.”

Scribe: Pollyanne Frantz

Thanks, Pollyanne!