Science has come a long way in the last century. Advances in communication, changes in who funds scientific research and how, and who actually conducts the research have all changed the scientific community. Shifts in views and an increasing acceptance of people from various cultures and demographics have also propelled science into a new era.
In most cases, science is no longer misused to reinforce racial or gender stereotypes or otherwise support social injustice. Although by no means an exhaustive list, here are 10 examples illustrating how science has changed, for better or worse.
What do you think are the most significant changes — good and bad — that have taken place?
Women in Science
While there are still some areas of science that employ few women, there were far fewer female scientists 100 years ago. And despite significant contributions by women scientists, the scientific community back then was often dismissive of their work. One example is Henrietta Swan Leavitt, who in 1912 discovered a special characteristic of Cepheid variables: that their period is directly related to their luminosity. This relationship provided a way to measure distances across the universe, and without her contribution, Edwin Hubble would not have been able to show that galaxies exist outside the Milky Way, or that the universe is expanding. Leavitt was not an astronomer, but a “computer” at the Harvard Observatory, where she made 25 cents an hour counting stars on photographic plates. Although gender equality continues to be a contentious issue in many areas of scientific study (just ask Lawrence Summers), Leavitt and women like her would have a far better chance at recognition now.
Advances in Communication
Advances in communication make science easier to practice today compared to 100 years ago. Throughout history, there are examples of independent scientists reaching the same breakthroughs, such as calculus and the theory of evolution, nearly simultaneously. While this sometimes happens today, collaborations are much easier since the advent of the Internet, as files and data can be shared almost instantaneously. And open-access publishers of scientific journals like Public Library of Science (PLoS) make peer-reviewed scientific articles available immediately after publication, free of charge. Overall, this expedited exchange of information makes the community significantly more productive.
Federal Funding
Federal funding for scientific research didn’t exist 100 years ago; research was instead supported by charities and wealthy philanthropists. Now, major federal agencies like the National Science Foundation (NSF) and the National Institutes of Health (NIH) play a significant role in scientific research in the United States. For example, the NSF currently funds around 20 percent of basic research in academia, and nearly all research in fields like math and computer science. While federal organizations have played an important role in the production of scientific knowledge, the downside is that the government has a say in which projects get funding, and personal beliefs and politics can hinder scientific progress. Depending on who is in office, this can have deeply negative effects on specific fields, as when George W. Bush restricted federal funding for embryonic stem cell research in 2001. (Pictured: a colony of human embryonic stem cells.)
The Eugenics Movement
At the turn of the 20th century, the eugenics movement was gaining steam, and eugenics research facilities and organizations were springing up across the U.S. This is an example of science at its worst. Although eugenics eventually paved the way for modern genetics (in fact, many famous modern genetic research centers were originally built for eugenics studies), its origins are quite sinister and had horrible repercussions. The basic idea behind eugenics was to select “desirable” human traits and characteristics and to breed people (yes, like prize animals) so that each generation would be better than the last. While the original intentions may have been good — to try to protect future generations from physical or mental disabilities, or to produce a healthier and stronger population — in truth, the idea was shortsighted, elitist, and racist. After the atrocities of World War II, in which the Nazis drew on the tenets of eugenics in their slaughter of millions of Jews, the movement lost popularity in the scientific community and beyond. Good riddance. (Pictured: Sir Francis Galton, eugenics pioneer)
The Genetics Revolution
While its historical roots may be troublesome, the field of genetics has revolutionized our understanding of biological processes. We now know the structure and function of molecules that hadn’t even been discovered 100 years ago. Today, we can genetically engineer crops to produce their own pesticides, clone a growing list of animals, identify genes that may contribute to disease, and achieve many other modern biological marvels. But debates on whether or not this is an improvement run deep. While insect-resistant plants might increase crop yields, thereby taking a step toward eradicating food shortages, many are skeptical of their safety. Cloning brings up a multitude of ethical concerns. And while people can now hire a company to map their genome and find which genes may predispose them to mental and physical illness, thereby creating the possibility of individualized medical treatment, these capabilities also raise concerns about privacy, not to mention the fact that gene therapy trials have led to patient deaths. So, depending on which side you take, the genetics revolution is either an improvement or a disaster.
The Commercialization of Science
The commercialization of science has perhaps never been as prevalent as it is today. Merging business and science certainly has upsides. For example, companies specializing in laboratory equipment and materials greatly expedite research. Need a specific antibody or breed of animal to conduct research? No need to generate them in the lab, which takes time and money — simply order online and have them delivered within days. But the commercialization of science has also caused many companies to focus more on profit than on using science to benefit humanity. For example, there are claims that pharmaceutical companies spend significantly more money on marketing than on research and development, and they typically focus on blockbuster drugs that will make the most money. Diseases that don’t necessarily raise profit margins, like malaria and AIDS, get far less financial attention from the private sector.
Humans in Cages
Ota Benga is possibly the most well-known example of the scientific racism that ran rampant in the early 20th century. Just last year, his story inspired characters in two blockbuster movies (The Curious Case of Benjamin Button and The Fall). And his story is a sad one. Benga was an Mbuti — one of a group of Congolese pygmies — who was purchased from slave traders by an American named Samuel Phillips Verner. Verner took Benga and several other African men to display at the St. Louis World’s Fair in 1904, in a special exhibit of “native” people. Sadly, this was not the last of Benga’s experiences as an anthropological curiosity. Two years later, he was put on display in the monkey house at the Bronx Zoo in New York, where he was eventually housed with an orangutan. The zoo was eventually forced to release Benga, and he spent the next several years in Brooklyn and Virginia. He committed suicide in 1916.
Peer Review
In 1905, Albert Einstein wrote five groundbreaking papers, including his special theory of relativity. They were published in the German journal Annalen der Physik, and while they were reviewed by the editor, Max Planck, they did not go through anything similar to the peer-review process that is in place today. In fact, most scientific journals did not have peer review back then. The process has many critics, and there are instances where well-respected scientific journals have rejected important papers, including research that has gone on to win Nobel prizes. The list of groundbreaking work initially rejected by the journal Nature, for example, includes the Krebs cycle, work on photosynthesis, and Stephen Hawking’s black-hole radiation. Even Einstein’s work may not have been published under such strict guidelines. However, peer review also helps filter out work that is not up to par and also helps catch falsified results, plagiarism, and other instances of scientific misconduct.
Climate Change
In 1896, Swedish chemist Svante August Arrhenius published the first paper describing what is now known as the greenhouse effect, and suggested that burning fossil fuels would cause global warming. By 1910, most scientists had dismissed his predictions, and climate change didn’t get much attention again until the 1970s. Now climate change is accepted as fact by most scientists; organizations like the Intergovernmental Panel on Climate Change (IPCC) collect data related to global warming, and significant research is going into understanding its underlying mechanisms. Scientists across disciplines are working on ways to mitigate the negative effects of global warming. There is also a focus on making products and technologies that have minimal impact on the environment, as well as research on sustainable energy to help wean us off fossil fuels.
Science in Academia
Another way in which science is improving and evolving compared to 100 years ago is the way it’s studied in academia. Around 1900, American research institutions began to take their modern shape, specialized graduate programs emerged, and the natural sciences became a standard part of higher education. This paved the way for the structured scientific disciplines and research institutions that are now the cornerstone of basic research. While many aspects of this system are still in place, interdisciplinary programs are emerging that allow for cross-pollination among multiple scientific disciplines, as well as between science and the humanities. This is decidedly a good thing: it fosters a broader understanding and links areas of study that previously weren’t in direct communication. (Pictured: Stanford University’s James H. Clark Center)