Letter from an Editor

The following emails are real correspondence between me and an Associate Editor at a prestigious scientific journal, which I call “Journal X” here, sent after I published an article in a journal with a very similar title, which I call “Journal Y.” I have removed the journal and editor names for privacy.

Photo by Scott Kidder

Dear Dr. Adamson,

Congratulations on your recent excellent paper. May I ask whether you specifically targeted your paper at “Journal Y” (a relatively new “look-alike” entry to the field in 2013)? I’m an Associate Editor for the more established “Journal X,” and I was curious to what extent authors may be submitting to “look-alike” journals intentionally vs. accidentally.


Dr. Editor

Associate Editor

“Journal X”


Dear Dr. Editor,

Thank you for contacting me about this recently published paper. I am grateful for the opportunity to explain why I chose to submit to the open access “Journal Y” instead of the well-respected “Journal X” and I would like to hear your thoughts. I am cautious about predatory journals, and I consciously wrestled with the pros and cons in journal selection.

I originally prepared this manuscript for submission for “Journal X,” where I thought it would fit nicely in one of the online-only sections. When the manuscript was finalized, I received an invitation to contribute to a special issue of “Journal Y” that included a publication fee waiver.

Having never heard of this look-alike journal before, I looked into it and weighed these elements: credibility, cost, ethics, and time to widespread dissemination of the findings.

First, I checked the credibility of “Journal Y” by reading past issues. The quality was good, and they had published a review paper in 2014 by my research heroes Drs. A and B. I thought, “if this journal is good enough for A and B, it is certainly good enough for me.”

I noticed that “Journal Y” had very low or no publication fees despite being Open Access, which signaled a difference from predatory journals with low standards and high profits. Since my current trainee funding does not cover publication fees, the paid Open Access option through your journal was not feasible unless I used personal savings.

The University of Washington Biomedical Research Integrity Series lectures on the ethics of responsible publishing changed my opinions about Open Access vs subscription journals. I take seriously my moral responsibility as an HIV researcher to communicate findings in a format accessible to scientists and patients in communities disproportionately affected by this disease. Many of my brilliant economist and mathematician colleagues are based at small institutions and companies with limited funding to purchase articles.

In the end, I am pleased to have chosen “Journal Y” over “Journal X” for several reasons:

  • Less than 8 weeks passed from my submission to publication (including two rounds of revisions)
  • I received rigorous and helpful peer-review comments
  • There was no cost for Open Access (free with the special-issue invitation, and low cost otherwise)
  • Within the first month, the full text was downloaded more than 200 times on six continents
  • Last year I submitted a different paper, a very good one in my opinion, to your journal. After more than two months, I received a rejection with peer-review comments so mean and personal they were borderline unprofessional. While this look-alike journal does not have the prestige or impact factor of “Journal X,” I see my generation progressively placing more value on the quality of content, free accessibility, dialogue, and citations of individual papers rather than on the sum of journal impact factors on a CV.

For these reasons, I am committed to Open Access publishing whenever I have the opportunity and sufficient funds to do so. I admit there is still quite a lot for me to learn about scientific publishing, so I would greatly appreciate your expert feedback on these considerations.

Thank you for taking the time to read my paper and reach out to me personally with questions.


Blythe Adamson


Dear Dr. Adamson,

Thank you very much for your thoughtful reply. I’m glad to hear that this was an intentional (vs. accidental) decision. Please also accept my apologies for the “borderline unprofessional” peer-review comments you received when submitting to “Journal X”; that is definitely discouraging for authors.

I will pass on your comments to the other Associate Editors of “Journal X” on our quarterly conference call for discussion on how we can improve and hopefully attract your future papers.

Best regards,

Dr. Editor

Associate Editor

“Journal X”

Reminders About Propensity Scores

Propensity score (PS)-based models are everywhere these days. While these methods are useful for controlling for measured confounders in observational data and for reducing dimensionality in big datasets, analysts must use good judgment when applying and interpreting PS analyses. This is the topic of my recent methods article in ISPOR’s Value and Outcomes Spotlight.

I became interested in PS methods during my Master’s thesis work on statin drug use and heart structure and function, which has just been published in Pharmacoepidemiology and Drug Safety. To estimate long-term associations between these two variables, I used the Multi-Ethnic Study of Atherosclerosis (MESA), an observational cohort of approximately 6,000 individuals with rich covariates, subclinical measures of cardiovascular disease, and clinical outcomes over 10+ years of follow-up. We initially used traditional multivariable linear regression to estimate the association between statin initiation and progression of left ventricular mass over time, but found that PS methods gave us better control of measured confounding. After generating a PS for the probability of starting a statin, we used matching procedures to pair initiators with non-initiators and estimated the average treatment effect in the treated (ATT). Both the traditional regressions and the PS-matching procedures found a small, dose-dependent protective effect of statins against left ventricular structural dysfunction. This very modest association contrasts with findings from much smaller, short-term studies.
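The workflow just described (fit a logistic model for the PS, match initiators to non-initiators, then compare outcomes) can be sketched end to end. Since the MESA data are not public, the following Python sketch uses simulated data with a known treatment effect; every variable name, coefficient, and sample size here is an illustrative assumption, not the published analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated cohort: age and blood pressure confound both statin
# initiation and the outcome (all numbers are made up for illustration)
n = 2000
age = rng.normal(60, 10, n)
sbp = rng.normal(125, 15, n)
X = np.column_stack([np.ones(n), (age - 60) / 10, (sbp - 125) / 15])
treated = rng.binomial(1, 1 / (1 + np.exp(-X @ np.array([-1.0, 0.8, 0.5]))))
outcome = 0.5 * X[:, 1] + 0.3 * X[:, 2] - 0.4 * treated + rng.normal(0, 1, n)

# Step 1: propensity score from a logistic regression (Newton-Raphson fit)
beta = np.zeros(X.shape[1])
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    hessian = (X * (p * (1 - p))[:, None]).T @ X
    beta += np.linalg.solve(hessian, X.T @ (treated - p))
ps = 1 / (1 + np.exp(-X @ beta))

# Step 2: 1:1 nearest-neighbor matching on the PS, with replacement
t_idx = np.where(treated == 1)[0]
c_idx = np.where(treated == 0)[0]
nn = c_idx[np.abs(ps[t_idx][:, None] - ps[c_idx][None, :]).argmin(axis=1)]

# Step 3: average treatment effect in the treated (ATT)
att = (outcome[t_idx] - outcome[nn]).mean()
print(round(att, 2))  # should land near the simulated effect of -0.4
```

With real data, a packaged implementation (e.g., MatchIt in R or teffects in Stata) also provides diagnostics and appropriate standard errors.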

I did my original analyses in Stata, which has a few packages for PS, including psmatch2 and teffects. My analysis used psmatch2, which is generally considered inferior to teffects because it does not provide proper standard errors. I worked around this limitation, however, by bootstrapping confidence intervals, which were all conservative compared with the teffects confidence intervals.
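The bootstrap workaround amounts to resampling subjects, re-running the whole estimator on each resample, and taking percentiles. Here is a hypothetical Python sketch of a percentile-interval bootstrap, with a simple mean standing in for the full matching pipeline; note that naive bootstrap standard errors are known to be unreliable for nearest-neighbor matching, which is one reason teffects is usually preferred.

```python
import numpy as np

rng = np.random.default_rng(1)

def percentile_ci(data, estimator, n_boot=1000, alpha=0.05):
    """Percentile bootstrap CI: resample with replacement, re-estimate, take quantiles."""
    n = len(data)
    stats = np.array([estimator(data[rng.integers(0, n, n)]) for _ in range(n_boot)])
    return np.quantile(stats, [alpha / 2, 1 - alpha / 2])

# Toy stand-in: bootstrap a mean effect of about -0.4 (in practice the
# estimator argument would re-run the entire PS-matching procedure)
effects = rng.normal(-0.4, 1.0, 500)
lo, hi = percentile_ci(effects, np.mean)
print(round(lo, 2), round(hi, 2))
```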

Figure 1: Propensity score overlap among 835 statin initiators and 1559 non-initiators in the Multi-Ethnic Study of Atherosclerosis (MESA)

Recently, I gathered the gumption to redo some of the aforementioned analysis in R. Coding in R is a newly acquired skill of mine, and I wanted to harness some of R’s functionality to build nicer figures. I found this R tutorial from Simon Ejdemyr on propensity score methods in R particularly useful. After rebuilding my propensity scores with a logistic model that included approximately 30 covariates and 2,389 participant observations, I first wanted to check the region of common support. The region of common support is the overlap between the distributions of PS for the exposed versus the unexposed, which indicates the comparability of the two groups. Sometimes, despite fitting the model with every variable you can, PS overlap can be quite bad and matching can’t be done. But I was able to get acceptable overlap on the PS values for statin initiators and non-initiators (see Figure 1). Using the R package MatchIt to do nearest-neighbor matching with replacement, my matched dataset was reduced to 1,670 observations, in which all statin initiators were matched. I also checked covariate balance conditional on the PS in the statin initiator and non-initiator groups; examples are in Figure 2. In these plots, the LOWESS smoother is effectively calculating a mean of the covariate level at each value of the propensity score. I expect the means for statin initiators and non-initiators to be similar, so the smooths should be close. At the ends of the age distribution, I see some separation, which is likely normal tail behavior. Formal statistical tests can also be used to check covariate balance in the newly matched groups.
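Alongside LOWESS plots, the standardized mean difference (SMD) is a common numeric balance check; an absolute SMD below about 0.1 is a widely used rule of thumb. This Python sketch shows the computation on simulated, illustrative data, not the MESA covariates:

```python
import numpy as np

rng = np.random.default_rng(2)

def smd(x, treated):
    """Standardized mean difference of covariate x between treated and control."""
    x1, x0 = x[treated == 1], x[treated == 0]
    pooled_sd = np.sqrt((x1.var(ddof=1) + x0.var(ddof=1)) / 2)
    return (x1.mean() - x0.mean()) / pooled_sd

# Simulated covariate that is imbalanced before matching:
# treated subjects are about 5 years older on average
n = 1000
treated = rng.binomial(1, 0.4, n)
age = rng.normal(60 + 5 * treated, 10)

print(round(smd(age, treated), 2))  # roughly 0.5, i.e., clearly imbalanced
```

Applied again to the matched sample, the same function should move toward the |SMD| < 0.1 target if matching worked.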

Figure 2: LOWESS smooth of covariate balance for systolic blood pressure (left) and age (right) across statin initiators and non-initiator groups (matched data)

Please see my website for additional info about my work.

Exit interview: CHOICE alumna Elisabeth Vodicka

Editor’s note: This is the first in an ongoing series of interviews we’ve planned for the students graduating from the CHOICE Institute where we’ll get their thoughts on their grad school and dissertation experiences.

This first interview is with Elisabeth Vodicka, who defended her dissertation, “Cervical Cancer in Low-Income Settings: Costs and Cost-Effectiveness of Screening and Treatment,” on January 24th, 2018.

  • What’s your dissertation about?

My dissertation focused on the economics of integrating cervical cancer screening and treatment into existing health systems in East Africa (Kenya and Uganda). Although cervical cancer is preventable and treatable if detected early, screening rates are low in the region (3–20%, depending on regional characteristics). One strategy for improving access to potentially life-saving screening is to leverage the fact that women engage with various health system platforms for other types of care, such as family planning, childhood vaccinations, and tuberculosis and HIV treatment. Since these programs are already funded and staffed, screening could be offered to women in these settings via service integration at potentially low marginal cost.

To understand the economic impact of offering cervical cancer screening and treatment to women when and where they are already engaging with the health system, I conducted costing, cost-effectiveness, and budget impact analyses of integrating screening into two health care settings. First, I collected primary data and conducted a micro-costing analysis to determine the direct medical, non-medical, and indirect costs associated with integrating screening services into an HIV-treatment center in Kenya. For my subsequent aims, I conducted economic evaluations of the potential value created, in terms of cost per life-year saved and budget impact to the Ministry of Health, of integrating screening and treatment into HIV-treatment centers in Kenya and routine childhood immunization clinics in Uganda.

  • How did you arrive at that topic? When did you know that this is what you wanted to study?

I have long been passionate about women’s health issues and improving access to care in low-resource settings. During my second year in the program, I was offered an opportunity through the University of Washington’s Treatment, Research, and Expert Education program to conduct a micro-costing study to identify the costs associated with providing cervical cancer screening to women receiving HIV treatment at Coptic Hope Center for Infectious Diseases in Nairobi, Kenya. This opportunity presented a perfect overlap of my interests in women’s health, access, and health economics methods. After conducting the primary data collection, I began exploring the possibility of continuing this line of research through my dissertation.

  • What was your daily schedule like when you were working on your dissertation?

To be honest, I can’t say that I had a daily schedule that was consistent over the years of working on my dissertation. While I developed my short and long proposals, I was still in classes, working as an RA, and doing consulting work on the side. These commitments often dictated the time I had to focus on preparing my dissertation proposals, which I often worked on late at night. Once I passed my general exam, my time was much more flexible. I tend to be most productive at night, so I took advantage of this flexibility to make the most of my productive hours. Often this meant exercising and taking care of other tasks during the day and then leveraging my peak brain time to work on my dissertation in the evenings.

Additionally, creating regular social opportunities and accountability for my dissertation progress were key success strategies for me. My cohort and I created a dissertation writing group that met weekly to create accountability. Toward the final months of the dissertation (crunch time!), I joined a co-working space, used an online co-working app, and recruited friends and family to work together virtually and in-person to maximize accountability and meet my goals for each dissertation aim.

  • If you are willing to share, in what quarter did you submit your final short proposal and in what quarter did you graduate/defend? What were some factors that determined your dissertation timeline? 

I submitted my short proposal in Fall Quarter 2015, and I defended in Winter Quarter 2018. Together, my chair and I developed a timeline that mapped out each stage of the dissertation process from the short proposal to final defense.

  • How did you fund your dissertation?

Ongoing funding was received through work as an RA and TA. The TREE program generously supported my in-country work in Kenya. I also received additional financial and travel support through internal funding within CHOICE (e.g., Reducing Barriers for the Ambitious Fund, Rubenstein Endowment, etc.).

  • What comes next for you? What have you learned about finding work after school?

Currently, I am continuing to work as a freelance consultant on projects related to expanding access to care in low- and middle-income settings. This allows me the time and flexibility to target my employment search within groups that are an excellent fit – both organizationally and culturally – for my research interests and professional goals.

In terms of finding employment after school, the most important lesson that I have learned is to start early and network broadly. During my first year in the program, I set a goal to reach out to one new person in the field every month working on topics or in organizations that interested me. Over time and many networking coffees later, I learned about the types of organizations that might be a good fit for my interests, work style and personality, and developed positive relationships with other like-minded individuals.

CHOICE Institute Director Discusses Amazon Health Care Announcement

You may have heard the big news that came out of Seattle recently: Amazon is partnering with Berkshire Hathaway and JPMorgan Chase to address health care costs and quality by creating an independent health care company for their employees. Further details of their plan remain a secret to the general public, and the companies are likely still working out logistics amongst themselves. Given the 1.2 million employees involved in the three companies, however, many in the health care industry are thinking through the likely impact of this new partnership.


Anirban Basu, Director of the CHOICE Institute and professor of health economics at the University of Washington, was recently referenced in two regional blogs describing the potential significance of the proposed plan:

According to Anirban Basu, a health care economist at the University of Washington, the trio could do a number of things to reform the health care system just by their sheer size and power alone. While most small and individual health care buyers have little power when it comes to directly negotiating with either health care providers or pharmaceutical companies, this partnership could change that—at least for those who qualify for it. Currently, price negotiating falls on third-party pharmacy benefit managers, at a cost then passed on to consumers.

Besides taking on bargaining power, Basu says Amazon may even open primary care clinics for their employees, but this could expand beyond their base.

It is important to note that while the new health plan may eventually have industry-wide effects, its scope will be limited to the companies’ employees at the beginning. And it is hardly a new phenomenon for employer groups to choose self-insurance as a means to control costs.

Henry Ford was one of the first industry giants to start his own health care insurance and delivery system in 1915, and America’s largest managed care organization, Kaiser Permanente, originally started as a health care program for employees of the Kaiser steel mills and shipyards.

Another important item to note is that America’s health care system has already been undergoing fundamental changes. While the United States Congress remains divided about how to move forward with the Affordable Care Act and improve the nation’s health care system overall, private health care companies are making their own moves. Hospital and insurance markets are becoming increasingly consolidated (with less competition to control prices), and some health care stakeholders are partnering and consolidating in innovative ways to capture market share (for example, the pharmacy company CVS Health just bought insurance company Aetna in January 2018).

Amazon’s new health care company could simply be joining these trends: the historic trend of companies self-insuring to cut costs, or the newer trend of consolidating aspects of American health care for increased market power. However, it is entirely conceivable that the potent combination of Amazon (a technology giant), JPMorgan Chase (a banking giant), and Berkshire Hathaway (an investment giant) will bring something new to the table. Vox and Stratechery are among many media outlets offering interesting predictions.

After the announcement, stock prices of major health care companies (e.g., Anthem, UnitedHealth, CVS, and Walgreens) experienced a sell-off as investors worried about the implications. However, some experts believe the current market will weather the storm, given the massive operational costs the partnership would face in entering the health care market. Moreover, they argue, the combined scale of Amazon, Berkshire Hathaway, and JPMorgan Chase will not be enough to compete with health care industry giants that already have purchasing power.

Will this health care partnership be a game changer? Perhaps, perhaps not. But as health care economists and health policy enthusiasts, students at the CHOICE Institute will certainly be watching our neighbors with interest.

[Written with the assistance of Mark Bounthavong and Nathaniel Hendrix.]

Statements of purpose: a view from the admissions committee

I recently had the opportunity to serve on a graduate program admissions committee and had a few reflections on the application process that I wanted to share. In particular, I wanted to give some advice to applicants on what makes a good statement of purpose and how you can use that document to put your best foot forward in the application process.

Graduate programs in health economics and outcomes research (HEOR) recruit from a number of undergraduate and master’s-level fields. This intersection of disciplines, in addition to the lack of undergraduate programs specific to HEOR, creates a unique set of considerations that applicants need to keep in mind when writing their statements of purpose. Here are a few brief tips on writing a successful statement of purpose in your application to HEOR programs:

Tip 1. Establish your connection to the field

First, it needs to be abundantly clear to the reader that the applicant understands what HEOR is. While some students – such as those with a previous master’s degree – may have been able to engage directly with the field, others have to demonstrate their familiarity more explicitly. This advice is especially applicable to clinicians, who bring an invaluable perspective to HEOR studies but are rarely able to participate in research during their clinical training.

Writers can demonstrate an understanding of HEOR in their statements of purpose by talking about specific classes they have taken or mentors who have talked to them about the field. It’s also vital for applicants to demonstrate an understanding of current issues in the field. Keeping up with HEOR blogs and journals is a great way to gain this understanding! The Academic Health Economists’ Blog and Healthcare Economist are among my favorites.

Tip 2. Clarify your goals

Next, the admissions committee will be curious about your goals and how a graduate degree in HEOR might help you reach them. This means writing about what you plan to do after graduate school. Nobody is going to hold you accountable to what you say in your statement of purpose, but discussing goals demonstrates to the admissions committee that you have reflected on your future and that you can be somewhat self-directed should you enter the program. Graduate school should not be a way of postponing your entry into the “real world.”

Tip 3. Demonstrate your knowledge of the specific program

It’s important to make abundantly clear that you have researched your target programs thoroughly. Especially for smaller programs, it’s helpful to explain how you heard about the program. If a mentor suggested you apply, mention this! With the HEOR community being relatively small, there’s a good chance that someone on the admissions committee will know your mentor, which is likely to work in your favor.

Tip 4. Polish your writing

Graduate training is a massive investment for both the school and the student. When making decisions about who to admit, the admissions committee uses the statement of purpose to assess your writing skills, the rationale for your decision to apply to their program, and your suitability for a career in HEOR. A well-written statement can pique the interest of everyone on the committee, while a poorly written one can make a candidate seem less interesting, even when every other part of their application looks ideal. Polish your statement into a masterpiece that showcases your enthusiasm for the field and you’ll be a step ahead in the admissions process.

Value in Health Care in 2018: A Broader Approach with New Research Needs


Photo of the University of Washington campus by Ben Babusis

Meng Li


In 1950, national health expenditure as a share of gross domestic product in the United States was 5%. The share rose to 18% in 2016. Such rapid growth of healthcare expenditure has spurred widespread interest in value assessments of medical technologies in general, and of new medicines in particular. The American College of Cardiology-American Heart Association (ACC-AHA), American Society of Clinical Oncology (ASCO), Institute for Clinical and Economic Review (ICER), Memorial Sloan Kettering Cancer Center, and National Comprehensive Cancer Network (NCCN) are among the organizations that have proposed and implemented new value assessment frameworks. Given their different perspectives and decision-making needs, these organizations have identified a wide range of factors underlying value, to name a few: clinical benefit vs. risk, magnitude of net benefit, precision of estimates, cost-effectiveness, budget impact, affordability, novelty, research and development cost, rarity, and population health burden. Most of these existing frameworks, however, lack a consistent theoretical foundation in health economics, which has led to the omission of important components of benefit or cost given their perspectives.

To inform the shift towards a more value-based healthcare system in the US, the International Society for Pharmacoeconomics and Outcomes Research (ISPOR, https://www.ispor.org) started the Initiative on US Value Assessment Frameworks in 2016. The objectives of the Initiative are to describe the conceptual basis for value, examine existing value frameworks, identify novel elements of value, and recommend good practice in value assessment. In contrast with many existing frameworks, the approach to value taken by the Initiative is grounded in health economics, yet recognizes special circumstances when a microeconomic approach cannot easily accommodate all relevant elements of value.

In the final report by the Initiative, to be published in February, many novel elements of value are identified and defined: productivity, adherence-improving factors, reduction in uncertainty, fear of contagion, insurance value, severity of disease, value of hope, real option value, equity, and scientific spillovers, along with the conventional quality-adjusted life-years (QALY) gained and net costs. Among these conventional and novel elements, QALY gains, net costs, and adherence-improving factors are value concepts from the health system perspective, while the rest are from the broader societal perspective. The Initiative also identified a number of additional value elements that are currently impractical to estimate due to lack of data or appropriate methodologies, such as fit with existing programs or infrastructure, end-of-life alternatives, ethical considerations related to manipulation of genetic materials, fears associated with specific types of therapies, etc.

Following the Second Panel on Cost-Effectiveness in Health and Medicine, whose report was published in the Fall of 2016, the Initiative recommended using cost-per-QALY-gained as the starting point for decision-making for healthcare resource allocation, and that other novel elements of value should be considered when relevant and if practical. The Initiative proposed the concept of “augmented cost-effectiveness analysis (CEA)”, where more measures of value besides health gains, societal costs, and financial risk protection are considered in a value assessment. Their report outlines several potential approaches to incorporating these novel elements into value assessments. One is to incorporate them either in the cost or in the QALY in the cost-per-QALY-gained ratio. Another approach is to monetize all health and related benefits, therefore converting the CEA into a cost-benefit analysis. A third approach is to compare element-by-element, which is in line with the “Impact Inventory,” a framework for considering consequences of an intervention as they impact different sectors put forth by the Second Panel. Finally, a fourth approach is to use multi-criteria decision analysis. Each of these approaches has its own advantages and methodological challenges, and more research and experience are needed in the application of augmented CEA.
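As a concrete illustration of the recommended starting point: the incremental cost-effectiveness ratio is simply incremental cost divided by incremental QALYs gained, and the same inputs give the net monetary benefit at a chosen threshold. The numbers below are entirely made up for illustration.

```python
# Hypothetical comparison of a new therapy vs. standard of care (made-up numbers)
d_cost = 25_000.0  # incremental cost per patient ($)
d_qaly = 0.5       # incremental QALYs gained per patient

icer = d_cost / d_qaly  # cost per QALY gained
print(icer)  # 50000.0

# Net monetary benefit at an illustrative willingness-to-pay threshold of
# $100,000/QALY; a positive NMB means good value at that threshold
wtp = 100_000
nmb = wtp * d_qaly - d_cost
print(nmb)  # 25000.0
```

Augmented CEA would then adjust these inputs (or the threshold) to reflect the novel elements of value when relevant and practical.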

Authors of the Initiative report also address issues around budgets and thresholds. The general recommendation is to base reimbursement policies on what is good value for money given a health plan’s budget. Good value for money can be achieved by using an explicit cost-per-QALY-gained threshold along with modifiers to that threshold. Both thresholds and modifiers for public and private health plans should reflect plan members’ or taxpayers’ preferences. The novel elements of value mentioned above are candidates for modifiers to the thresholds.

In contrast to other highly industrialized nations, stakeholders in the US have not fully embraced using cost-effectiveness to inform resource allocation decisions in healthcare. The US has the highest per capita spending on healthcare in the world, while the life expectancy of Americans lags behind that of 30 or so other countries and has even declined for the past two years. Many now believe that this is partly due to a misalignment between value and payment, and that realigning the two is critical to bending the healthcare cost curve. It is encouraging to see new value assessment frameworks for medical technologies gaining visibility and some traction in the US. In order for them to have a greater impact on reimbursement decisions, the research community needs to make sure these frameworks are conceptually and methodologically sound and the assessment processes are transparent. The ISPOR Initiative has broadened our view of what constitutes value in health care; more research is needed to understand and estimate novel elements of value in different healthcare decision contexts and their implications.

Book Review

Difficult Choices Between What is Best for Science or Best for Our Career


Rigor Mortis: How Sloppy Science Creates Worthless Cures, Crushes Hope, and Wastes Billions
By Richard Harris
288 pages, Perseus Books Group, List Price $28

In “Rigor Mortis: How Sloppy Science Creates Worthless Cures, Crushes Hope, and Wastes Billions,” Richard Harris provides compelling evidence, through a series of stories and statistics, that medical research is plagued by unnecessary errors despite our technology, effort, money, and passion to make a positive impact. This review takes the perspective of a graduate student in the health sciences, with the aim of assessing the value of Rigor Mortis for the next generation of scientists. While the book focuses mostly on sloppy biological science, Harris’s concerns are equally valid in data science and disease modeling.

Richard Harris, a journalist at NPR who writes about science, has started an important conversation about the broad impact of our current scientific culture: we are publishing too many scientific studies which may have false or unreproducible results. Graduate students in health science research or related fields should not be surprised by Harris’s premise. The pressure to produce a large quantity of publications, instead of fewer and higher quality papers, weighs on every grad student in the world.

In 2017, the CHOICE Institute asked its members to read Rigor Mortis and discuss its implications for our field. One emerging theme was that trainees need to be able to report unethical behaviors without fearing adverse consequences. While required annual courses from the University of Washington Biomedical Research Integrity Series challenge students to reconsider their own personal conflicts of interest in publishing research, this remains a difficult ideal to implement in the face of other pressures. Around the lunch table in our grad student lounge, the book sparked an uncomfortable conversation about multiple testing during regression model fitting, and the long stretch of grey area between a dusty, pre-specified analysis plan and our shiny, new hypothesis-generating exploratory findings.

Harris’s storytelling reminded me of a book I love by David Quammen called “Spillover.” Both Rigor Mortis and Spillover are written by distinguished journalists about very complicated, technical problems. Using New York Times reader-friendly language, both authors weave in conversations with scientists from around the world so that the layperson can understand their stories.

Both books highlight a common dilemma in academia: Should I do what is best for science or what is best for my career? Further, is this an incentives problem or a system problem? The current structure and business of research guide us to make choices that will enhance our career, while science is still often perceived as an altruistic pursuit for the greater good. The book offers a challenge to academic researchers: who among us can claim “no conflict of interest”?

“Attending the panel that rejected his paper proposal, the grad student inwardly trashes each presenter’s research.” — Lego Grad Student

Applying the book’s messages to health economics and outcomes research

I experienced this dilemma when deciding whether to share my HIV disease model. Scientific knowledge and methodology should be completely transparent, yet software coding to implement these techniques is intellectual property that we should not necessarily give away for free. My dilemma isn’t unique. Disease modelers everywhere struggle with this question: should we post our Excel spreadsheet or R code online for others to review and validate and risk having our discovery poached?

This is just one example of tension Harris highlights in his book, and why it is so complex to change our current scientific culture. Scientific advancement is ideally a collective good, but individuals will always need personal incentives to innovate.

Key book takeaways for young scientists

  1. Use valid ingredients
  2. Show your work
  3. No HARKing (Hypothesizing After the Results of the study are Known)
  4. Don’t jump to conclusions (and discourage others from doing this with your results)
  5. Be tough. People may try to discredit you if your hypothesis goes against their life’s work, or for any number of reasons.
  6. Be confident in your science.
  7. Recognize the tension between your own achievement and communal scientific advancement

Further discussions for fixing a broken system

  1. If money is being wasted in biomedical science and research, how do we fix the system to save money without sacrificing incentives to produce valuable innovations? One of our CHOICE Institute graduates, Carrie Bennette, asked this very question in cancer research and you can read about her findings here.
  2. Incentives need to be changed. Academic promotions should not be dependent on the number of our publications but the quality and impact of our contributions. Can we change the culture obsessed with impact factor and promote alternatives such as the H-index or Google Scholar metrics?
  3. Academic tenure systems are antiquated. How do we balance the trade-offs between hiring a post-doc and hiring a permanent staff scientist? Post-doc positions train the next generation and are cheap, but they create workflow discontinuity through frequent turnover. Permanent staff scientists stay longer, but hiring them would disrupt an ingrained academic pipeline.


I think students in any science-related field would benefit from reading this book. Cultural and systemic change will happen faster when we have uncomfortable conversations at the table with our colleagues and mentors. Additionally, we need to take the baton Richard Harris has passed us and run with our generation of colleagues toward finding and implementing solutions. As our influence in our respective fields grows, so too does our responsibility.

Rigor Mortis is available in hardcover on Amazon for $18.65 and Audible for $19.95.