Quanta Magazine


How our brain, a three-pound mass of tissue encased within a bony skull, creates perceptions from sensations is a long-standing mystery. Abundant evidence and decades of sustained research suggest that the brain cannot simply be assembling sensory information, as though it were putting together a jigsaw puzzle, to perceive its surroundings. This is borne out by the fact that the brain can construct a scene based on the light entering our eyes, even when the incoming information is noisy and ambiguous.

Consequently, many neuroscientists are pivoting to a view of the brain as a “prediction machine.” Through predictive processing, the brain uses its prior knowledge of the world to make inferences or generate hypotheses about the causes of incoming sensory information. Those hypotheses — and not the sensory inputs themselves — give rise to perceptions in our mind’s eye. The more ambiguous the input, the greater the reliance on prior knowledge.

“The beauty of the predictive processing framework [is] that it has a really large — sometimes critics might say too large — capacity to explain a lot of different phenomena in many different systems,” said Floris de Lange, a neuroscientist at the Predictive Brain Lab of Radboud University in the Netherlands.

However, the growing neuroscientific evidence for this idea has been mainly circumstantial and is open to alternative explanations. “If you look into cognitive neuroscience and neuro-imaging in humans, [there’s] a lot of evidence — but super-implicit, indirect evidence,” said Tim Kietzmann of Radboud University, whose research lies in the interdisciplinary area of machine learning and neuroscience.

So researchers are turning to computational models to understand and test the idea of the predictive brain. Computational neuroscientists have built artificial neural networks, with designs inspired by the behavior of biological neurons, that learn to make predictions about incoming information. These models show some uncanny abilities that seem to mimic those of real brains. Some experiments with these models even hint that brains had to evolve as prediction machines to satisfy energy constraints.

And as computational models proliferate, neuroscientists studying live animals are also becoming more convinced that brains learn to infer the causes of sensory inputs. While the exact details of how the brain does this remain hazy, the broad brushstrokes are becoming clearer.

Unconscious Inferences in Perception

Predictive processing may seem at first like a counterintuitively complex mechanism for perception, but there is a long history of scientists turning to it because other explanations seemed wanting. Even a thousand years ago, the Muslim Arab astronomer and mathematician Hasan Ibn Al-Haytham highlighted a form of it in his Book of Optics to explain various aspects of vision. The idea gathered force in the 1860s, when the German physicist and physician Hermann von Helmholtz argued that the brain infers the external causes of its incoming sensory inputs rather than constructing its perceptions “bottom up” from those inputs.

Helmholtz expounded this concept of “unconscious inference” to explain bi-stable or multi-stable perception, in which an image can be perceived in more than one way. This occurs, for example, with the well-known ambiguous image that we can perceive as a duck or a rabbit: Our perception keeps flipping between the two animal images. In such cases, Helmholtz asserted that the perception must be an outcome of an unconscious process of top-down inferences about the causes of sensory data since the image that forms on the retina doesn’t change.

During the 20th century, cognitive psychologists continued to build the case that perception was a process of active construction that drew on both bottom-up sensory and top-down conceptual inputs. The effort culminated in an influential 1980 paper, “Perceptions as Hypotheses,” by the late Richard Langton Gregory, which argued that perceptual illusions are essentially the brain’s erroneous guesses about the causes of sensory impressions. Meanwhile, computer vision scientists stumbled in their efforts to use bottom-up reconstruction to enable computers to see without an internal “generative” model for reference.

“Trying to make sense of data without a generative model is doomed to failure — all one can do is make statements about patterns in data,” said Karl Friston, a computational neuroscientist at University College London.

But while acceptance of predictive processing grew, questions remained about how it might be implemented in the brain. One popular model, called predictive coding, argues for a hierarchy of information processing levels in the brain. The highest level represents the most abstract, high-level knowledge (for instance, the perception of a snake in the shadows ahead). This layer makes predictions, anticipating the neural activity of the layer below, by sending signals downward. The lower layer compares its actual activity against the prediction from above. If there’s a mismatch, the layer generates an error signal that flows upward, so that the higher layer can update its internal representations.

This process happens simultaneously for each pair of consecutive layers, all the way down to the bottommost layer, which receives actual sensory input. Any discrepancy between what’s received from the world and what’s being anticipated results in an error signal that ripples back up the hierarchy. The highest layer eventually updates its hypothesis (that it wasn’t a snake after all, just a coiled rope on the ground).
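The two-layer loop described above can be sketched in a few lines of code. This is a minimal illustrative sketch of the predictive coding idea, not the brain's actual implementation or any published model: a higher layer holds a hypothesis `r` about the hidden cause, predicts the lower (sensory) layer's activity through a set of generative weights, and updates its hypothesis from the error signal that flows back up. All variable names here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generative weights mapping a 3-dim hidden cause to 8 "sensory" units.
W = rng.normal(size=(8, 3))
true_cause = rng.normal(size=3)
x = W @ true_cause              # sensory input produced by the true cause

r = np.zeros(3)                 # higher layer's initial hypothesis
errors = []
for _ in range(100):
    prediction = W @ r          # top-down prediction of sensory activity
    error = x - prediction      # mismatch becomes the bottom-up error signal
    r += 0.01 * W.T @ error     # higher layer revises its hypothesis
    errors.append(np.linalg.norm(error))

# The prediction error shrinks as the hypothesis converges on the cause.
```

In a deeper hierarchy, every pair of adjacent layers would run this same exchange at once, with each layer's activity serving as "sensory input" for the predictions coming from the layer above.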

“In general, the idea of predictive coding, especially when it’s applied to the cortex, is that the brain has basically two populations of neurons,” de Lange said: one that encodes the current best prediction about what is being perceived and another that signals errors in that prediction.

In 1999, the computer scientists Rajesh Rao and Dana Ballard (then at the Salk Institute for Biological Studies and the University of Rochester, respectively) built a formidable computational model of predictive coding that had neurons explicitly for prediction and error correction. They modeled parts of a pathway in the visual processing system of primate brains that consists of hierarchically organized regions responsible for recognizing faces and objects. They showed that the model could recapitulate some unusual behaviors of the primate visual system.

This work, however, was done before the advent of modern deep neural networks, which have one input layer, one output layer and multiple hidden layers sandwiched between the two. By 2012, neuroscientists were using deep neural networks to model the primate ventral visual stream. But almost all these models were feedforward networks, in which information flows only from the input to the output. “The brain is clearly not a purely feedforward machine,” de Lange said. “There’s lots of feedback in the brain, about as much as there is feedforward [signaling].”

So neuroscientists turned to another type of model, called a recurrent neural network (RNN). These have features that make them “an ideal substrate” for modeling the brain, according to Kanaka Rajan, a computational neuroscientist and assistant professor at the Icahn School of Medicine at Mount Sinai in New York, whose lab uses RNNs to understand brain function. RNNs have both feedforward and feedback connections between their neurons, and they have constant ongoing activity that is independent of inputs. “The ability to produce these dynamics over a very long period of time, essentially forever, is what gives these networks the ability to then be trained,” said Rajan.
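The self-sustaining dynamics Rajan describes can be seen in even a toy recurrent network. The sketch below is a generic vanilla RNN step of my own construction (not Rajan's model): the recurrent weights feed the network's own activity back into itself, so the state keeps evolving even when no external input arrives. The gain of 1.5 on the recurrent weights is an assumption chosen to put the network in a regime with rich ongoing activity.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 16                                              # number of units
W_rec = rng.normal(scale=1.5 / np.sqrt(n), size=(n, n))  # feedback connections
W_in = rng.normal(scale=0.1, size=(n, 4))                # feedforward input weights

def step(h, x):
    """One update: recurrent feedback plus external input, squashed by tanh."""
    return np.tanh(W_rec @ h + W_in @ x)

h = rng.normal(size=n)
zero_input = np.zeros(4)
for _ in range(50):
    h = step(h, zero_input)     # activity persists with no external drive
```

A purely feedforward network given zero input would simply go silent; the recurrent loop is what lets activity continue indefinitely, which is the property Rajan points to as making these networks trainable.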

Prediction Is Energy-Efficient

RNNs caught the attention of William Lotter and his doctoral thesis advisers David Cox and Gabriel Kreiman at Harvard University. In 2016, the team showed off an RNN that learned to predict the next frame in a video sequence. They called it PredNet (“I’ll take blame for not having enough creativity to come up with something better,” said Lotter). The team designed the RNN in keeping with the principles of predictive coding as a hierarchy of four layers, each one predicting the input it’s anticipating from the layer below and sending an error signal upward if there’s a mismatch.

They then trained the network on videos of city streets shot from a camera mounted on a car. PredNet learned to continuously predict the next frame in a video. “We didn’t know if it would actually work,” said Lotter. “We tried it and saw it was actually making predictions. And that was pretty cool.”

The next step was to connect PredNet to neuroscience. Last year in Nature Machine Intelligence, Lotter and colleagues reported that PredNet demonstrates behaviors seen in monkey brains in response to unexpected stimuli, including some that are hard to replicate in simple feedforward networks.

“That’s fantastic work,” Kietzmann said of PredNet. But he, Marcel van Gerven and their colleagues at Radboud were after something more basic: Both the Rao and Ballard model and PredNet explicitly incorporated artificial neurons for prediction and error correction, along with mechanisms that caused correct top-down predictions to inhibit the error neurons. But what if those weren’t explicitly specified? “We wondered whether all of this ‘baking in’ architectural constraints is really needed or whether we would get away with an even simpler approach,” said Kietzmann.

What occurred to Kietzmann and van Gerven was that neural communication is energetically costly (the brain is the most energy-intensive organ in the body). A need to conserve energy might therefore constrain the behavior of any evolving neural network in organisms.

The researchers decided to see whether any of the computational mechanisms for predictive coding might emerge in RNNs that had to accomplish their tasks using as little energy as possible. They figured that the strengths of the connections, also known as weights, between the artificial neurons in their networks could serve as a proxy for synaptic transmission, which is what accounts for much of the energy usage in biological neurons. “If you reduce weights between artificial units, that means that you communicate with less energy,” said Kietzmann. “We take this as minimizing synaptic transmission.”
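One simple way to picture this setup, under my own simplifying assumptions rather than as the authors' actual training code, is a predictor trained to minimize its error plus a penalty on connection strengths, with the penalty standing in for the cost of synaptic transmission. Here an L1 penalty on a linear model's weights drives the connections that don't help prediction toward zero; the data, penalty strength `lam`, and learning rate are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 10))              # inputs with 10 candidate features
true_w = np.zeros(10)
true_w[:3] = 1.0                            # only 3 features actually matter
y = X @ true_w                              # targets generated by those features

lam = 0.1                                   # strength of the "energy" penalty
w = np.zeros(10)                            # connection weights to be learned
for _ in range(2000):
    # Gradient of mean squared prediction error, plus L1 penalty subgradient.
    grad = X.T @ (X @ w - y) / len(X) + lam * np.sign(w)
    w -= 0.05 * grad

# Weights on the irrelevant features are driven close to zero: the network
# "communicates" only where it pays off for prediction.
```

The network keeps the few connections it needs to predict well and suppresses the rest, which is the flavor of trade-off an energy constraint imposes on a neural system.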



