NORDP 2014 Conference Notes: Small Investments, Big Impact

This session focused on Boise State University as a case study of an institution striving to move from a teaching-intensive mission to a research mission. Boise State started as a junior college in 1932 and became a university in 1974, with three master's programs. In 2003, they hired a new president, Bob Kustra, who in 2005 articulated a vision to become a “Metropolitan Research University of Distinction.” At that time, they had 65 master's programs and 2 PhD programs. In 2012, they developed a Strategic Plan to gain distinction as a research university, and they now have 10 PhD programs.

Initially, the central Division of Research took a very high-level view that faculty should simply be submitting more proposals. The message was, “C’mon, faculty!” and, not surprisingly, this did not go over well with faculty. They then decided to look at institutional barriers to faculty engagement in securing external funding for their research. One helpful document is the National Research Council’s Partnerships for Emerging Research Institutions: Report of a Workshop, published in 2009, which identified several barriers facing universities trying to make a similar transition: 1) there is insufficient reward for faculty who pursue research; 2) the teaching load is so high that it is hard for faculty to find time to do research and pursue funding; and 3) there is limited administrative support for research and for pursuing research funding within these institutions. To become a successful research institution, Boise State decided to address these barriers.

Faculty in the College of Health Sciences at Boise State came from a rich history as master educators, not researchers. As a result, the absence of pilot work and publication history among faculty left them in a less competitive position for external funding proposals, and they needed a safe space to ask for help. In addition, new faculty hires were getting conflicting messages: department heads were focused on meeting the department’s teaching needs, but higher-level administrators had research expectations for these faculty. Furthermore, without help, faculty who did submit grants were generally not successful, so they became discouraged.

Two tandem strategic initiatives were instituted at Boise State University to address the identified barriers: the College of Health Sciences established an embedded research development office, while a research development initiative was added to the central Office of Sponsored Programs. For either office working alone, changing the culture would have been too heavy a lift, but they found they could do much more together than separately, so they began working as partners. The advantage of this strategy was that Terri, working at the college level, understood the faculty member’s perspective and could get to know faculty individually, while Kim, working at the institutional level, knew about higher-level initiatives before they were rolled out and could promote higher-level connections and strategic partnerships, for example, through senior administrators to the national labs and state agencies. This allowed strategic positioning at the institutional level while also helping to develop and assist faculty so that they could be competitive for funding.

Terri and Kim presented a matrix of activities they pursued to support organizational and culture change across three levels: 1) personal, 2) peer-to-peer, and 3) organizational/infrastructure. The matrix had two columns according to which office took the main responsibility for spearheading each activity, the college-level research office or the Office of Sponsored Programs. Examples are research trajectory planning (personal/college-level research office), networking events (peer-to-peer/OSP), and implementing faculty incentives to reward research (organizational/OSP). (Please see their slides for the full matrix.) They then discussed which of these activities worked well, which didn’t, and lessons learned.

Example activities that worked well:

  • Implemented a faculty incentive pay program: Working with department heads, faculty who won grants could receive incentive payments funded by some of the salary savings generated when grants bought out their time.
  • Faculty mentoring: Seasoned faculty meet with junior faculty to discuss specific topics, for example, how to work with DOE.
  • Strategic Research Development Initiative: An institutional research development program, but not a seed grant program (which hadn’t really yielded external funding). This provided small, targeted investments to fund specific needs identified as bottlenecks or capacity limiters: for example, the need for a particular piece of equipment to collect preliminary data. Follow-up is important.
  • Individual assistance with research design: This helped faculty get to the point where they could be competitive.
  • Built up post-award infrastructure: This was a huge pain point. Post-award administration used to be part of Finance and Administration, which resulted in disconnects between pre- and post-award management of sponsored programs. Faculty who did win awards became frustrated with administrative inefficiencies, which then became a disincentive for pursuing additional funding. To improve the post-award infrastructure, the Division of Research absorbed the post-award administration function into the Office of Sponsored Programs. This restructuring ensured consistency and continuity of service and put OSP in a stronger position to advocate institutionally for support of faculty research administration needs.

Some other things didn’t work well, and an important lesson learned is to evaluate these activities early and not be afraid to pull the plug if they aren’t working. Some examples they mentioned:

  • Faculty writing group: This was implemented at the request of the faculty, but after the first few meetings attrition became a problem. Terri quickly decided that there was not enough return on her investment of time, so she stopped this activity.
  • Internal peer review process: It turned out that faculty reviewers were either too gentle in their reviews because they knew the PI well and perhaps feared being identified, or they were too harsh, which wasn’t helpful to the PI.
  • Newsletter: Wasn’t being read.
  • Formal networking events: Faculty did not respond to these even though they were good about gathering at a local watering hole for social interaction. It probably didn’t help that there was no alcohol.

Lessons Learned and Successful Strategies

  • Don’t try “pushing a rope.” Top-down proposals with no faculty leadership are destined for failure. When upper administrators push these projects, it’s helpful to explain to them that a poorly developed proposal without faculty buy-in will hurt the institution’s reputation among reviewers and program officers. It’s a question of not being ready yet, not of passing up an opportunity: the institution may be in a better position to compete in the next round. This explanation usually resonates with institutional leadership.
  • Don’t try to be all things to all people. Initially, Kim met with all departments and offered help to all comers, trying to convince even people who weren’t interested to submit. As a result, she quickly became overwhelmed. She learned that it is better to focus on the institution’s strengths, and this works better at the college and department level, where it’s possible to get to know the faculty and their research better.
  • Establish your research development infrastructure early. Don’t wait for successes and increased research activity to develop this infrastructure. At the college level, even a 1 or 1.5 FTE commitment can make a big difference.
  • Educate up. Because the institution is evolving, it is important to help upper administrators, deans, etc. understand institutional barriers, the need for infrastructure, and research development strategies.
  • Don’t expect immediate results. The College of Health Sciences made a strategic decision to pull back from submitting a lot of proposals until they could be competitive. Terri and her colleague worked with a cohort of faculty over two years to help them get publications, do pilot studies, etc. so that they could be competitive for grants rather than encouraging them to submit when they weren’t in a competitive position yet. As a result, the number of proposals initially went down, but now it’s paying off, and they had $3.5M in proposal submissions in the last quarter.
  • Don’t be a mile wide and 1 inch deep.  They decided to refocus their research development to “work with the willing” and make small strategic investments to help develop research capacity, and that has paid off.

Impact and Culture Change

  • They now have a larger pool of PIs who are competitive, with a critical mass in areas of research strength
  • More strategic hires based on research strengths, not just teaching
  • Momentum: This is supported by quantitative and qualitative measures of success, but it’s important for the administration to understand that value is beyond just the number of proposals that go out the door.
  • More institutional research capacity and space
  • The number of submitted proposals has doubled in the last 8 years
  • Research expenditures increased 87% from FY 2007 to FY 2013, and they now have 10 PhD programs

Takeaways for Others in a Similar Situation

  • Survey the landscape, and identify barriers and obstacles to research and pursuing research funding
  • Find leverage points; for example, develop relationships between college research development personnel and research development personnel in central offices such as the Office of Sponsored Programs. Sharing communication, resources, experience, connections, and training helps align strategic priorities and institutional education.
  • Fail often in order to succeed faster: an accelerated cycle of innovation. Don’t be afraid to try things, but also pull the plug if they aren’t working and try something else. Support from your administration is critical to allow the freedom to fail and try again.
  • Conduct a critical appraisal of the ROI for the initiatives you try (example: the seed funding program didn’t result in external funding, so they moved to more strategic, targeted funding).
  • Repeat

Example of Cycle of Innovation

Terri and Kim concluded their presentation by walking through an example of how they worked together to vet and fund an internal grant project and how each of them was able to marshal resources (financial and administrative) to assess the project’s potential for future funding, address administrative and legal considerations efficiently, monitor the project, and lay the groundwork for successful deliverables.

Terri Soelberg, Director, College of Health Sciences Office of Research
Kimberly Page, Assoc. Director, Office of Sponsored Projects


Scribe: Lucy Deckard

Thanks, Lucy!!

Author: Julie Rogers

Research Development Associate, Oregon Health & Science University
