NORDP 2016 Conference Notes: Empowering women leaders in research through alternative pathways

This post is part of our NORDP 2016 Conference Reports. These reports capture the take-home points from a variety of sessions presented at the NORDP Annual Meeting in Orlando.

Empowering women leaders in research through alternative pathways

Presenter: Alicia Knoedler
Key points from the session:
  1. About 74–76% of NORDP members are women.
  2. Leadership Development in Research Development (LDRD) – The skills developed from RD activities qualify RD professionals for leadership roles within higher education institutions and/or within NORDP.
  3. As an RD professional, you are already following the 5 tenets of leadership as defined by Ron Heifetz (these are listed in the session’s PowerPoint presentation).
  4. To be successful in RD today, you must be entrepreneurial, creative, innovative, and not afraid to take risks.
  5. NORDP will work to identify the broad base of skills/ideas/needs for LDRD. The question of how we, as a professional organization, can empower RD leaders was discussed.
What resources did you discover at this presentation?
Dispatches from 20 North Wacker
A white paper that outlines the concepts and missions of both NORD (New Opportunities for Research Development) and LDRD. (You must be a NORDP member to access this document.)
What else from this session should NORDP members know?
  • It’s not always the RD professional who gets the recognition; RD professionals often lead in the background – working with faculty leaders.
  • Potential next steps were discussed: NORDP could establish leadership development cohorts within its LDRD programming, advance a NORDP curriculum for leadership development, and promote leadership development opportunities within the NORDP community, such as best practices and annotated experiences.

If you are interested in joining the discussion on LDRD, let Alicia know.

NORDP 2016 Conference Notes: Developing research capacity and grant readiness in investigators

This post is part of our NORDP 2016 Conference Reports. These reports capture the take-home points from a variety of sessions presented at the NORDP Annual Meeting in Orlando.

Session Scribe: Karen Fletcher

Developing Research Capacity and Grant Readiness in Investigators

Presenters: Marjorie Piechowski and Sarah Polasky

Key points from the session:

  1. Get yourself involved in Faculty Orientation.
  2. Find out how much managerial experience and personnel (HR) awareness your new researcher has – most have none – and then provide guidance.
  3. Provide templates for anything you can.
  4. Consider providing editorial assistance for publications.
  5. All recommendations depend on context.

What did you hear at this presentation that surprised you?

The suggestion to host a workshop (focused on managerial skills) for graduate students before they leave your institution, as a way to train the next generation of faculty.

What resources did you discover at this presentation?

An assessment tool: PI Grant Readiness worksheet/list. This is a self-assessment for PIs of how much preparation they have already completed in order to be competitive for a grant; it could be used as a talking point with junior faculty. Contact the presenters (sarah.polasky@asu.edu and piechow4@uwm.edu) for a copy.

What else from this session should NORDP members know?

  • Don’t scare faculty with too much information – consider providing them with no more than 5 funding opportunities that are due within the next 6 months.
  • Find out if your new researcher has a research plan with their mentor. Junior faculty usually know little about what grants have been awarded in their area – help them identify those.
  • Grant Readiness should include: 1) Strategic Planning for Research Funding; 2) Ability Assessment; 3) Mentoring Support (individual/internal or external, departmental, institutional); and 4) Logistics (lab space, how to fill out a purchase order, etc.).
  • After creating a Strategic Research Grant Plan for a faculty member, meet with them again after a year to update the plan. Consider asking them for a report.
  • Provide them with project management support.

NORDP 2016 Conference Notes: Demystifying the U.S. Dept. of Education

This post is part of our NORDP 2016 Conference Reports. These reports capture the take-home points from a variety of sessions presented at the NORDP Annual Meeting in Orlando.

Session Scribe: Kristin Wetherbee

Demystifying the U.S. Department of Education

Presenter: Marjorie Piechowski

Key points from the session:

  • The U.S. Dept. of Education is not very consistent in its funding opportunities. Programs may not be offered every year, and there are few established due dates. Formatting and page limit requirements can vary. Also, some submissions must go through grants.gov, while others must go through the U.S. Department of Education’s e-grants system.
  • Notices are announced via the Federal Register and the U.S. Dept. of Education website with a minimum 30-day notice (often, only 30 days’ notice is given).
  • Proposals should cite literature from the National Clearinghouse, which holds documents about the current state of research.
  • Proposal components:
    • Personnel – must adequately describe role and credentials relative to the proposal
    • Project design and need – often weighed most heavily by reviewers
    • Adequacy of resources – need to address the specifics of what you’re asking for (cost per student, cost share, institutional resources)
    • Evaluation – often weighted heavily, up to 20% of total points. The Department seems to prefer external evaluators so you must provide excellent justification if using your own evaluation tool.
    • Special and competitive priorities – these may or may not be required. Bonus points may be given for addressing them so don’t make reviewers hunt for this language; state clearly and boldly in the proposal.
  • Program officers don’t have to be experts in the field, and some PIs have found that program officer comments are in direct conflict with what the review committee wants.

What did you hear at this presentation that surprised you?

The consistency of the Department’s lack of consistency.

What resources did you discover at this presentation?
What else from this session should NORDP members know?
You must routinely visit the U.S. Dept. of Education website to stay current on offerings and deadlines, and you must thoroughly review calls for proposals for changes from year to year. Also, if you’re interested in being a reviewer for the U.S. Dept. of Education, a Ph.D. is not required (a master’s degree is preferred). Register at http://www.g5.gov/.

NORDP 2015 Conference Report: Strategies to Support Multi-Institutional, Cross-Conference Research Collaborations

by Marilyn Korhonen, Ed.D.
Associate Director Center for Research Program Development and Enrichment
Office of the Vice President for Research, University of Oklahoma

Panelists: Martha Cooper and Nathan Meier

The Traumatic Brain Injury Project is a great example of being ready to seize an opportunity, and it serves as a lesson to watch for such opportunities and be flexible enough to respond. One important aspect of the project is the timeliness and critical need to address concerns about the impact of more than 3 million brain injuries that occur each year, many of which are associated with college athletics programs.

A second critical factor is the presence of an established organizing structure. In particular, the Big Ten Athletic Conference and the Ivy League came together to improve traumatic brain injury prevention, detection, and treatment strategies. While this project aligns with athletic conferences, it is enabled by the Committee on Institutional Cooperation (CIC), which is a consortium of the Big Ten member universities plus the University of Chicago. The CIC has been in place for more than 50 years, enabling the member institutions to advance their academic missions by sharing expertise, leveraging campus resources, and collaborating on innovative programs.

A third factor is the ability to govern, fund, and staff the project quickly and equitably. CIC is governed and funded by the Provosts of the member universities, and coordinated by a staff from its Champaign, Illinois headquarters. Thus the project had a natural, established home. This governing body allowed for focused goals and focused approaches implemented in coordinated ways.

Most of these factors exist primarily outside of research development. So a fourth important factor is to make a case for research and scholarship, and to leverage the resources established for the program. In this case, having a larger sample size of athletes with potential traumatic brain injuries enabled use of evidence-based, clinical protocols. These protocols may lead to collaboration with the Department of Defense, allowing for comparison of TBI based on a greater number of factors.

Finally, the University of Nebraska was in a position to provide leadership as well as physical resources to create a Center for Brain Biology and Behavior, which is attached to their Athletics Performance Lab (all within the football stadium). This strengthens their research program and provides even greater resources to the overall TBI Project. This project has already resulted in 22 research collaboration efforts and 12 distinct sources of funding.

Some of the challenges include:

  1. Increased competition for limited federal funds
  2. Balancing tensions between collaboration and competition
  3. Lack of equity in the institutional contributions of seed funding and other support toward the project.

Ultimately, the presenters expect that a strong focus on common goals will be the key to the success of their project.

NORDP 2015 Conference Report: Building an NIH Portfolio Without a Local Medical School

By Karen Markin, PhD, Director of Research Development, University of Rhode Island
To build a portfolio of grants from the National Institutes of Health at an institution without a medical school, it is essential to understand the agency’s mission, according to Janet E. Nelson, associate vice chancellor for research development at the University of Tennessee. That mission is to seek knowledge that enhances health, lengthens life and reduces illness and disability.

Nelson was one of three panelists who discussed strategic planning for successful grant-seeking from NIH in an increasingly competitive environment. The panel was part of NORDP’s annual Research Development Conference in Bethesda, MD.

Award rates at NIH are falling, noted Jennifer L. Webster, manager of strategic research initiatives at the University of Tennessee. However, it is still making grants focused on certain initiatives, including precision medicine, antibiotic resistance, cancer, brain research, Alzheimer’s disease and new vaccines.

Institutions without medical schools can compete by focusing on their strengths relative to other institutions. Panelists urged participants to think about the unique strengths of their institutions. For example, panelist Meredith Murr said the University of California at Santa Barbara has a top engineering department with talents it can leverage into NIH awards. The institution also hires strategically, focusing on medical researchers, and builds collaborations outside the university.

Other tips from the panelists:

  • Invite an NIH program officer to speak at your campus.
  • Organize quarterly networking events and involve off-campus groups.
  • Conduct red-team reviews on grant proposals.
  • Offer proposal development workshops.

NORDP 2015 Conference Report: Preparing Competitive STEM Education Development Proposals: Planning for Sustained Adoption

By Vanity Campbell
Proposal development and project management of large STEM education proposals often lack design elements to ensure sustained adoption of successful programs. The presenters shared six key best practices to increase the impact and systemic change expected from such proposals. In the past, funding agencies encouraged a solitary, cyclic model for STEM education improvement based on research, evaluation, and implementation of innovative methods. This system relied on isolated, individualized development of new methods by researchers with limited outside feedback and involvement. The dissemination of program outcomes was often passive, using conferences, websites, and publications to share program results. As a result, proposed solutions were unique to specific institutions and lacked transferability and scalability.

New federal agency trends are emerging in STEM education development to address limited adoption and portability and to broaden dissemination. To identify best practices, the presenters analyzed 75 NSF CCLI grant proposals funded in 2009, case studies of well-propagated innovations (PhET, PLTL, Peer Instruction), and recent related literature. The results showed that effective propagation requires six key elements:

  • identification of potential adopters,
  • an extensive plan for attracting, training, and supporting adopters,
  • attention to propagation early, while the program is ongoing,
  • identification of the relevant instructional system elements,
  • a clearly defined plan, with rationale and strategy, and
  • alignment of the innovation, potential adopters, and selected strategies.

As a hands-on activity, the presenters led workshop participants through an evaluation of a proposed STEM education proposal using an assessment instrument focused on project type, target curricula, propagation activities, and plans. Participants reviewed and assessed two different proposal project summaries and compared evaluation ratings and comments. From this exercise, participants learned that successful propagation identifies an intended audience, engages users, initiates propagation plans at the outset, treats the plan as an instructional system, and defines a clear and thorough plan and strategy.

Successful propagators identify potential adopters, interact with them, and support them. To achieve this, proposal planning requires interactive development, interactive dissemination, and support at three levels: individual, department, and institution. Interactive development includes partner institutions, advisory boards, and beta testing; isolated development, in comparison, involves primarily institutional stakeholders. An interactive dissemination plan consists of immersive workshops, leveraging professional societies, pilot sites, and fostering scholarship in other faculty, whereas static dissemination, such as sharing results only through articles and websites, should be avoided. Lastly, adequate support assists adopters through networks, customizable materials, and consultation. This ensures successful adoption, in contrast to adopters implementing new methods in isolation with no support for addressing challenges.

Implementing strong propagation plans can strengthen STEM education proposals and ensure sustained adoption and successful impact of active programs.

For contacts and additional information, see www.increasetheimpact.com.

NORDP 2015 Conference Report: Innovations in Research

By Lucy Deckard
Presenters: Margaret Hilton (National Research Council), James Gentile (Hope College), and Kara Hall (National Cancer Institute and member of the National Research Council). Margaret Hilton gave an overview: this session discussed a series of reports related to Innovations in Scientific Research (specifically, interdisciplinary and team science), the latest of which came out in late April 2015.

  • “Facilitating Interdisciplinary Research” (2005)
  • “Convergence: Facilitating Transdisciplinary Integration of Life Sciences, Physical Sciences, Engineering and Beyond” National Research Council (2014)
  • “Enhancing the Effectiveness of Team Science” National Research Council (2015)

The first report defined “interdisciplinary” vs. “transdisciplinary” research (the latter transcends disciplinary boundaries). In theory, an individual can conduct these kinds of research alone, but in reality that rarely happens. This is where you get to the realm of team science – science conducted interdependently by more than one person.

These reports came up with some common recommendations for changes needed to promote and accommodate these new ways of doing scientific research:

  • Revise promotion and tenure policies
  • Expand funding mechanisms and review criteria
  • Conduct research/evaluation to understand and guide improved interdisciplinarity and convergence in science

James Gentile spoke about “convergence”:

Scientific research is becoming more problem-centered. Mother Nature is winning, and she has no departmental structural constraints. In order to solve complex questions in science we need true innovation and interdisciplinary collaboration. In addition, tools in science are exploding, bringing disciplines together.

The grand challenges that we want to converge about include:

  • Green energy
  • Chemistry and physics of living systems
  • Synthetic capacity of life
  • -omics to uncover new approaches to disease
  • Others were also listed.

The Research Corporation, the Howard Hughes Medical Institute, and others in a science coalition came together and made a context map. Addressing these problems means working through a web that includes law, policy, and economics, as well as virology.

So in the future we will have to learn how to converge. For example, brain mapping requires many different types of expertise. Interdisciplinary research is usually altruistic: a researcher takes a sample to a colleague in chemistry and asks if she’ll run it on her machine, and she does this as a favor. In contrast, when we converge, I get my question answered, but that answer presents a new question for the colleague in chemistry. In this case, everyone is growing. They coined a new term: “Scialog,” from science and dialog.

The Research Corporation for Science Advancement brought together researchers to consider a national priority: energy from photosynthesis (in essence, can we build an artificial tree based on nanotechnology?). They invited researchers to form teams, but before the teams pitched the science, the Research Corporation just wanted to hear the justification for the team composition. They ended up funding a group that created bio-inspired silicon photovoltaics. Convergence also works in education. He gave the example of having students design robotic “cockroaches”. See https://www.youtube.com/watch?v=JysIA-4fcA4 and NRC, 2014.

Kara Hall talked in more detail about the recently released report, “Enhancing the Effectiveness of Team Science”:

The committee looked at factors that impacted effectiveness of science teams:

  • Individual factors
  • Factors at team/center/institute level (organizational factors)
  • Management approaches and leadership styles
  • How tenure and promotion are affected
  • etc.

The team included people with a broad range of backgrounds, including psychologists, biologists, social scientists, etc. They used several measures to evaluate effectiveness, including which research is cited more and which yielded more patents. They found that research done in teams is cited more, yields more patents, and demonstrates high levels of innovation.

They defined the following terms:

Team science – collaborative, interdependent research conducted by more than one individual

Science team – 2–10 individuals

Larger group – more than 10 (teams of teams)

Team effectiveness – a team’s capacity to perform

Key features that cause more challenges for team science are large membership diversity, the need to effect deep knowledge integration, (sometimes) large size, goal misalignment, permeable team boundaries (meaning members may move in and out as the research evolves), geographic dispersion, and high task interdependence. They concluded that there is already a strong body of research on team processes as they relate to effectiveness, but most of that research was done on teams such as business teams (not science research teams), so we need to bring that literature into the context of science.

They identified three main areas where interventions could enhance effectiveness: team composition, team professional development, and team leadership. Kara summarized several recommendations based on current research in each of these areas.

Composing your team: Consider using task analytic methods to identify needed knowledge, skills and attitudes. These methods can be used to match task-related diversity among team or group members.  Also, consider moving outside your usual network – for example by leveraging networking tools.

Team Professional Development: Team professional development models prevalent in business could be applied to science. The committee recommended that we look at these models to see what’s out there and develop them so that they are relevant to science teams. When dealing with diverse teams and trying to develop shared knowledge, it’s very important to devote time to developing a shared vocabulary. This may seem like it takes a lot of time, but in the end it will be worth it.

Leadership: The committee noted that there is already fifty years of research on teams and organizational leadership, so we should take advantage of this robust foundation and adapt it for leaders of science teams and larger groups.

The team also recommended that, in order to address the challenges of geographic dispersion, we conduct research on virtual collaboration and geographically dispersed science teams. They also recommended that dispersed teams consider task assignments within semi-independent units at each location to reduce the burden of constant electronic communication.

The team also concluded that while universities have launched new efforts to promote interdisciplinary team science (e.g., getting rid of departments), the impact of these initiatives on the amount and quality of team science has not been systematically evaluated. It may be that the main hurdles are not actually disciplinary structures but instead factors such as promotion and tenure (P/T) criteria.

This points out the importance of aligning reward structures with encouraging team science. Many university P/T review policies are uncreative and don’t give credit for team-based research. One model might be big physics, where research has long been done in very large teams and researchers can get credit for pre-publications (e.g., software, databases), not just first-authored publications.

Funding agencies also need to play a role in encouraging a culture change in the scientific community. The report recommends that funders encourage development and implementation of new collaborative models (e.g., research networks, consortia). They also need to support the development of resources that support team science (e.g., info repositories, training modules, ensuring data is available for mining). We also need more targeted research about team science, but few funding programs support this research.

She also recommended that folks attend the SciTS (Science of Team Science) 2015 conference, June 2–5, 2015, in Bethesda, MD.

Questions and Answers:

Question: Are more diverse teams more difficult to manage?

  • Faculty are a non-pack-oriented group – leadership and administrative oversight can be difficult
  • Teams should start out small – then they can grow as they demonstrate success
  • In the team science report there was a lot of discussion around diversity – we will see evolving culture shifts and more emphasis in education. A new Chief of Science Workforce Diversity at NIH was just named. As we get used to more diversity, it will get easier.
  • The National Academy will hold a meeting targeting provosts and deans, highlighting their role in addressing some of these team science issues (authorship is one example, since different disciplines have different authorship criteria).

Question:  Are there recommended strategies for forming teams?

  • When you have a team that has worked together and then bring in some new people, that often works best. If you’ve been working together too long, you can lose your innovative edge. Sometimes concatenating two teams also works.
  • When there’s too much competition among teams, it degrades the teams’ ability to work with one another, which may be needed in the future. One way to address this is to develop large networks and initiatives that include multiple centers to foster collaboration.
  • NIH is offering opportunities to bring together scientists to think about a problem space – not a commitment. That way, teams can form, and when there is an application, it’s more sophisticated.
  • Another strategy is to form a team around teaching – if they can work together around teaching, a lot of research can come out of that, and it’s a way for the members to get used to working with one another (for example, developing innovative interdisciplinary non-major courses)

Question: Can core facilities help bring teams together?

  • You have to find a way to stimulate conversation. Some core facilities work better than others at stimulating interdisciplinary collaborations. The core needs to understand that this is one of their roles.
  • Sometimes the core is competing with the people they are supposed to be supporting – they need to get rewarded for bringing people together.

The presenters thanked their sponsors: NSF and Elsevier.