Photo: The gold Nobel Prize medal awarded to the late novelist Gabriel Garcia Marquez, displayed by a national library employee in Bogota, Colombia, April 17, 2015. (AP Photo/Fernando Vergara)

Patrick Collison and Michael Nielsen have an article in The Atlantic with the attention-grabbing headline “Science Is Getting Less Bang For Its Buck,” which prompted a fair amount of discussion on social media. It’s a clear improvement over a lot of lost-Golden-Age narratives in that they make an effort to quantify the fall from the past, but I still found it unconvincing.

Their most original contribution to the genre is a survey that attempts to quantify the decline in the importance of science using Nobel prizes as a proxy. They surveyed a large number of scientists, asking them to rate the relative importance of Nobel-winning discoveries from two different decades, and found a slight tendency to rank work from the early part of the 20th century more highly than more recent discoveries. This, they argue, is a sign that we’re not making discoveries with the same fundamental importance today as we were back in the day.

There are two problems with this, the first being that there’s really only a plausible downward trend for physics (the importance-vs-time graphs for the Chemistry and Medicine prizes are pretty flat), and even for a social-science result, that trend isn’t especially impressive. More importantly, though, the arguable peak in importance comes in the 1920’s and 1930’s, during the development of quantum mechanics. I don’t think the revolutionary progress of that era is something we could reasonably expect to be sustainable– that was a sui generis moment in physics, and using it as a starting point skews things in a way that’s not really appropriate. It’s sort of forced on them because the Nobel Prizes don’t go back all that far, but using that measure means they’re inadvertently using a classic “How-to-Lie-with-Statistics” trick.
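
To see why the choice of starting point matters so much, here’s a minimal sketch with invented numbers (mine, not anything from the survey data) showing how anchoring a comparison at an anomalous peak can make an otherwise flat record look like a steady decline:

```python
# Purely illustrative: made-up "importance" scores per decade, with one
# outlier peak standing in for the quantum-mechanics era of the 1920s-30s.
scores = {
    1910: 5.0,
    1920: 9.0,   # anomalous revolutionary period
    1930: 8.5,
    1950: 5.2,
    1970: 5.1,
    1990: 4.9,
    2010: 5.0,
}

peak = max(scores.values())
recent = [value for decade, value in scores.items() if decade >= 1950]
recent_mean = sum(recent) / len(recent)

print(f"Measured against the 1920s peak, recent decades look "
      f"{peak / recent_mean:.1f}x less important,")
print("even though the post-1950 numbers show no downward trend at all.")
```

The point is not that the survey data actually look like this, just that any trend line anchored at an outlier peak is bound to slope downward.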

They have a couple of other arguments that I found kind of weak as well, including a reference to the increase in co-authorship of papers:

[S]cientific collaborations now often involve far more people than they did a century ago. When Ernest Rutherford discovered the nucleus of the atom in 1911, he published it in a paper with just a single author: himself. By contrast, the two 2012 papers announcing the discovery of the Higgs particle had roughly a thousand authors each.

Again this is using a major outlier, this time on the modern end– the LHC papers have outsized author lists because the LHC is an enormous undertaking, and really the only game in town for work at the high-energy frontier.

Photo: A 1971 USSR postage stamp depicting Ernest Rutherford and a scheme of the scattering of alpha particles in his famous experiment. Rutherford, 1st Baron Rutherford of Nelson (1871 – 1937), was a New Zealand-born British chemist and physicist who became known as the father of nuclear physics. (Getty)

Even a more reasonable measure– median number of authors for papers posted to the arXiv, or some such– would run afoul of changing norms, though. Rutherford’s papers were technically single-author works because that was the standard at the time. Having looked at a lot of early-20th-century physics papers over the last several years, though, I can say that when you dig into these older experiments with one or two authors, they usually turn out to have a lot more people involved– they’ll have postscripts or footnotes that thank a number of technicians and assistants. In the intervening century, we’ve decided that the contributions of those people deserve recognition, so if modern standards were applied, most of those papers would have several co-authors. The historical standard under-counts the number of people actually involved in the work, in a way that makes the expansion look worse than it is.
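
For what it’s worth, a measure along those lines is easy enough to compute. Here’s a minimal sketch using the public arXiv API; the category, sample size, and other parameters are arbitrary choices of mine for illustration, not anything Collison and Nielsen used:

```python
# Sketch: median author count for recent arXiv submissions in one category.
# Assumes the feedparser package is installed (pip install feedparser).
import statistics
import urllib.parse

import feedparser

params = urllib.parse.urlencode({
    "search_query": "cat:hep-ex",    # high-energy experiment, an extreme case
    "start": 0,
    "max_results": 100,
    "sortBy": "submittedDate",
    "sortOrder": "descending",
})
feed = feedparser.parse("http://export.arxiv.org/api/query?" + params)

author_counts = [len(entry.authors) for entry in feed.entries]
print("papers sampled:", len(author_counts))
if author_counts:
    print("median authors:", statistics.median(author_counts))
```

Even a clean number like that, though, would still be comparing modern authorship norms against very different historical ones, which is exactly the problem described above.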

My primary complaint with the article, though, is that the reduction in “bang for the buck” in physics in particular seems to me to be less an indicator of a troubling stagnation than a sign of success. That is, a slowing in the rate of discoveries of fundamental importance, and an increase in the cost of those discoveries, is exactly what we ought to expect from science functioning as it should.

Obviously, this presupposes a particular model of the proper functioning of science. What I have in mind when I say that is the idea of science as a process converging on an ever-more-accurate representation of reality. Initial discoveries are relatively easy to make and contribute to our understanding on a relatively coarse scale, while each successive generation fills in finer details, but at greater cost.

Photo: A librarian examines Isaac Newton’s original 1672 scientific paper, with a drawing of his reflecting telescope, at the library of Britain’s Royal Society in central London, Nov. 25, 2009. The Royal Society, founded in 1660, counts Isaac Newton, Benjamin Franklin, Charles Darwin, and Stephen Hawking among its members. (AP Photo/Lefteris Pitarakis)

We can see this sort of progression toward finer detail and higher prices in the long history of physics: the field was launched by discoveries in classical mechanics involving big objects and large forces– the motion of the planets, the behavior of objects in free fall. These are a common part of everyday experience, so the cost is relatively small, and the increase in our knowledge is rapid and dramatic.

As mechanics became well developed, physicists began to turn to less common, less significant interactions: electric and magnetic forces. These are incredibly important on a fundamental level, but much harder to see in ordinary circumstances– creating electrostatic forces and magnets in a way that allows controlled investigations of their interactions was not trivial, particularly in the eighteenth and early nineteenth centuries as this stuff started to take off. The increase in our knowledge is still large, but the cost is considerably greater.

As time goes on, you get to quantum mechanics, which involves even tinier and more exotic physics, requiring even greater sophistication to carry out the necessary measurements. Quantum effects are essential for understanding our world– I’ve got a whole book on this— but you could be forgiven for not knowing that, because they’re only apparent after a bit of digging. And that digging costs money– by the early 1900’s, physics had moved almost entirely into formal, institutional settings as the resources needed to make progress started to exceed the capabilities of even idle aristocrats.

This continues on through the dawn of nuclear physics, and then into the era of ever-larger accelerators. The effects being studied become more and more subtle, and the experiments needed to study them become more and more expensive. At each step of the process, we’re filling in finer and finer details of our picture of the universe, and it takes more money and effort to nail down those details.

Photo: Undated picture of English chemist and physicist Michael Faraday. (AP Photo)

Now, this is not to say that these details aren’t important. From the perspective of technology and economics, the electromagnetic and quantum revolutions are vastly more important than the Newtonian ones, because most of our modern technology uses electric current to power transistor-based processors that rely on the quantum nature of semiconductors. If you think about it in terms of an improving approximation of reality, though, the scale at which we’re working is getting smaller and less obvious all the time, and that’s a good thing. If we were still making cheap and easy discoveries at the everyday scale of Newtonian physics, something would be horribly wrong with our model.
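
To put a rough number on “smaller and less obvious” (my back-of-the-envelope figures, not anything from the Atlantic piece), the corrections that post-Newtonian physics adds at everyday scales are almost comically tiny:

```python
# Back-of-the-envelope: how big are relativistic and quantum "corrections"
# for everyday objects? The values below are illustrative round numbers.
c = 2.998e8      # speed of light, m/s
h = 6.626e-34    # Planck's constant, J*s

v_car = 30.0     # roughly highway speed, m/s
gamma_minus_1 = 0.5 * (v_car / c) ** 2   # leading-order relativistic correction
print(f"Relativistic correction at 30 m/s: about {gamma_minus_1:.0e}")
# roughly 5e-15, i.e. parts per quadrillion

m_ball, v_ball = 0.145, 40.0             # a thrown baseball
wavelength = h / (m_ball * v_ball)       # de Broglie wavelength
print(f"de Broglie wavelength of a baseball: about {wavelength:.0e} m")
# roughly 1e-34 m, absurdly far below anything observable
```

That’s why the cheap and easy discoveries at everyday scales were all made long ago: the remaining deviations from the physics we already have only show up when you look very hard, at very small scales, with very expensive equipment.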

Now, there is a problem here from the bang-for-buck perspective, in that the remaining fundamental mysteries in physics– beyond-Standard-Model particles and quantum gravity– seem highly unlikely to provide the basis for new technologies in the way that electromagnetism and quantum mechanics did. I guess there’s a possibility that an experimental solution to the problem of dark matter might lead to some trillion-dollar method for extracting energy from the dark sector, but that seems pretty remote. I don’t think that represents any failure on the part of science as a whole, though. Instead, it’s an indication of just how well we’ve succeeded.

And, it should be noted, a slowing in the rate of fundamental breakthroughs does not necessarily portend the end of practical progress. There’s still plenty of room to push the limits of science that we understand pretty well already– new advances in materials and technologies based on the quantum ideas discovered in the 1930’s. And there’s arguably a lot more room for breakthroughs on the life-science side of things. We’re not going to run out of science to do anytime soon, even if it becomes harder and costlier to make new breakthroughs at the most finely detailed scales.
