Articles of Interest in Science

Current issues, news and ethics
Post Reply
swamidada
Posts: 1436
Joined: Sun Aug 02, 2020 8:59 pm

Post by swamidada »

Thanks for input Karim.
In fact, whatever is in the macrocosm is also in the microcosm.

"It is He Who created the night and the day, and the sun and the moon; all (the celestial bodies) swim along, each in its rounded course" (21:33). "The heavens, We have built them with power. And verily, We are expanding it" (51:47).
According to the Quran, the universe is expanding, and so are the capabilities of the mind.

There is so much focus on this world of ours, but what about the world beyond? There is an entire universe beyond us. Billions of galaxies like ours. Stars and Planets. Everything and so much more, more than our human minds could ever imagine.

In a world where everything is changing so rapidly, we forget to ponder what lies beyond our eyes.
swamidada
Posts: 1436
Joined: Sun Aug 02, 2020 8:59 pm

Post by swamidada »

Science
Pentagon worries that satellite attacks could spark ‘mutually assured destruction’
Alan Boyle
GeekWire, Fri, November 6, 2020, 7:46 PM CST

Satellite constellations are becoming increasingly important for military communications. (DARPA Illustration)
In the years ahead, the long-running nightmare of the nuclear Cold War — mutually assured destruction — could return in a new context on the final frontier, a Pentagon adviser said today at a Seattle-based space policy conference.

Brad Townsend, a space strategy and policy adviser to the leadership of the Joint Chiefs of Staff, raised the alarm about anti-satellite weapons, or ASATs, during a virtual symposium sponsored by the University of Washington’s Space Policy and Research Center.

He noted that China and Russia are already experimenting with methods to disable other nations’ satellites in the event of a future conflict. But in the course of destroying an enemy satellite, attackers could set off a catastrophic chain reaction of out-of-control orbital debris.

Such a phenomenon, sometimes referred to as the Kessler syndrome, has fed into the plotlines for movies such as “Gravity” and novels such as “Seveneves.” But Townsend warned that the threat is more than just a science-fiction possibility.
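
The intuition behind the Kessler syndrome can be sketched with a toy model: collisions produce debris at a rate that scales with the square of the object count, while drag removes objects roughly linearly, so above a critical density the cascade runs away. The rates below are invented illustration values, not real orbital data:

```python
# Toy Kessler-cascade sketch: collisions add debris ~ N^2, drag removes ~ N.
# All rate constants here are made-up illustration values, not real orbital data.

def simulate(n0, collision_rate=1e-5, decay_rate=0.1, years=10.0, dt=0.001):
    """Euler-integrate dN/dt = collision_rate*N^2 - decay_rate*N."""
    n = float(n0)
    for _ in range(int(years / dt)):
        n += (collision_rate * n * n - decay_rate * n) * dt
    return n

critical = 0.1 / 1e-5  # N* = decay_rate / collision_rate = 10,000 objects

above = simulate(12_000)  # starts above the critical density: runaway growth
below = simulate(8_000)   # starts below it: debris slowly decays away
print(f"critical density ~ {critical:.0f} objects")
print(f"start 12,000 -> {above:,.0f} after 10 yr (growing)")
print(f"start  8,000 -> {below:,.0f} after 10 yr (decaying)")
```

The point of the sketch is only the threshold behavior: once the population crosses the critical density, each collision makes further collisions more likely, which is the runaway Townsend warns about.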

“If nations start arming with ASATs as a way to deter other nations from attacking their orbital assets, they risk creating a new form of mutually assured destruction,” he said.

Townsend said the prospect of setting off a Kessler syndrome should have caused the world’s space powers to back away from the technology. “But as India’s 2019 test demonstrated, it hasn’t,” he said.

So what is to be done? One step would be to create an international system for sharing information about orbiting satellites, in order to head off unintended collisions. Another would be to encourage the development of space systems that could move satellites to orbital graveyards once they go out of operation — systems like Northrop Grumman’s MEV-1 satellite tug.

But to head off an intentional satellite attack, Townsend said the world’s nations would have to agree to ban the use of anti-satellite weapons, just as they’ve banned the use of biological weapons. “The time is right for de-escalation efforts before we have that future event,” he said.


In international talks about space weapons, the United States favors an approach known as transparency and confidence-building measures, or TCBM. China and Russia, meanwhile, have their own proposal for a treaty on the prevention of placement of weapons in outer space, known as PPWT. Each approach has run into opposition from the other side.

Matthew Stubbs, an expert in space law at the University of Adelaide in Australia, said there’s “considerable pessimism about the prospects of multilateral rulemaking for space at the moment.” He said the most likely scenario for resolving the issue involves a series of bilateral and multilateral agreements. NASA is taking such an approach for the Artemis Accords, a set of agreements that are expected to govern future moon exploration.

The space weapons issue illustrates how quickly the space frontier is becoming “a contested domain,” said Lt. Gen. John Shaw, who is the commander of the Combined Force Space Component Command as well as the commander of the U.S. Space Force’s Space Operations Command.

When the Pentagon began building satellite systems for command and control, “we built them as if we were in a benign domain,” Shaw said. But potential adversaries were quick to take note of the U.S. military’s growing reliance on space capabilities — which led to the Trump administration’s creation of the Space Force as a separate military branch last year.

Would space policy change if Joe Biden becomes president next year, as expected based on the results of this week’s election? Neither Shaw nor Townsend addressed that question — but Saadia Pekkanen, co-director of the Space Policy and Research Center, said Biden was “likely to stay the course.”

“If you are imagining that there might be a radical shift in space policy, I don’t quite see that,” she said.

During today’s keynote session, Sen. Maria Cantwell, D-Wash., noted that Congress still has to approve a significant piece of legislation pertaining to space policy: the NASA Authorization Act of 2019. “I can’t promise you that it’s going to get done in a lame-duck session of Congress,” she said, “but if it doesn’t, I will guarantee you it will be done in the very early part of 2021.”

Cantwell, who’s the ranking member of the Senate Commerce, Science and Transportation Committee, said the authorization bill would smooth the way for NASA to boost its support for landing systems capable of putting astronauts on the moon’s surface. That could include the landing system that Amazon CEO Jeff Bezos’ Blue Origin space venture is developing along with industry partners such as Lockheed Martin and Northrop Grumman.

The senator noted that Blue Origin, which is based in Kent, Wash., has become a prominent player in Washington state’s space industry, which accounts for $1.8 billion of economic activity annually. Other players include SpaceX, which is building its Starlink satellites at its facility in Redmond, Wash.; and Aerojet Rocketdyne’s Redmond operation, which is building rocket thrusters for future NASA missions.

“It’s not surprising that all of those efforts have led recently to our state being called ‘the Silicon Valley of Space,’” Cantwell said.

When it comes to future space exploration, NASA’s Artemis program to put astronauts on the moon looms largest on the horizon. The Trump administration has pressed NASA to execute the program’s first crewed landing by 2024, but Cantwell said that deadline might slip.

“We’re very excited about Artemis in general. … There’s not always consensus about when and what time frame we should have to meet this Artemis goal,” Cantwell said. Hitting the 2024 deadline “would require an enormous amount of resources.”

Wendy Whitman Cobb, an associate professor of strategy and security studies at the U.S. Air Force School of Advanced Air and Space Studies, said the Artemis program was likely to continue even if the White House changes hands, as expected, but with a different timetable.

“A Biden administration might be a little bit better at letting that go a little bit. … If anything, I think we might see a little bit more emphasis on the commercial capabilities and commercialization of space on the part of NASA,” she said. “That could be, just because there’s that natural sort of flow over from Vice President Biden’s experience with the Obama administration.”

https://currently.att.yahoo.com/finance ... 14699.html
swamidada
Posts: 1436
Joined: Sun Aug 02, 2020 8:59 pm

Post by swamidada »

Asteroid the size of Dubai’s Burj Khalifa heading towards Earth at 90,000kph
23 Nov, 2020 16:41


As a battered and beleaguered planet approaches the finish line of 2020, the universe appears to be playing a dark joke on humanity by sending five asteroids in two days, with a 2,700ft space rock to follow just days later.
The behemoth dubbed 2000 WO107, estimated to measure up to 820 meters (0.51 miles) in diameter, or roughly the height of Dubai’s Burj Khalifa, the tallest building in the world, will be paying Earth a visit (cosmically speaking) at 10.09am GMT (5.09am ET) on Sunday, November 29.

The visitor won’t be sticking around for long as, despite its enormous size, it is traveling at a whopping 25.07km per second – or roughly the equivalent of 90,000kph. For reference, the average bullet travels at around 4,500kph. It also won’t be visible to most Earthlings as it is set for what NASA deems a “close flyby” at a paltry 0.02876 Astronomical Units (AU) from Earth – or 2,673,409 miles.
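
The figures above are easy to sanity-check. A quick script (the AU-to-mile constant, roughly 92.96 million miles, is standard and not from the article) reproduces both numbers:

```python
# Sanity-check the reported asteroid figures.
KM_PER_S = 25.07              # reported speed
AU_IN_MILES = 92_955_807.3    # one astronomical unit, in miles

speed_kph = KM_PER_S * 3600                 # km/s -> km/h
miss_distance_miles = 0.02876 * AU_IN_MILES # reported flyby distance in AU

print(f"{speed_kph:,.0f} kph")              # 90,252 kph, i.e. "roughly 90,000kph"
print(f"{miss_distance_miles:,.0f} miles")  # 2,673,409 miles, as reported
```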

NASA defines a Near Earth Object (NEO) as any asteroid or comet whose orbit brings it within 1.3 AU of the sun, close enough to pass through Earth’s neighborhood.

In the meantime, there will be plenty of activity in our cosmic backyard, with five asteroids due in the area on Monday and Tuesday alone.

On Monday, three space rocks, 2020 WN (9.5m in diameter), 2020 VW2 (14m) and 2020 WC (10m), will shoot past Earth at 1.6 million kilometers, seven million kilometers, and 1.6 million kilometers respectively.

The average distance between Earth and the Moon is about 385,000km, but NASA considers all objects within a 7.5 million-kilometer range (or 19.5 times the distance to the Moon) to be worthy of monitoring, just in case any undergoes a sudden change of course.

Tuesday will see the 47-meter 2017 WJ16 and 31-meter 2020 TJ8 pass the planet at two million kilometers and 6.4 million kilometers, before the next round of space rocks buzzes our planetary defenses.
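
Expressed in lunar distances (the ~385,000 km figure above), the monitoring threshold and these flybys work out as follows, using only the distances quoted in the article:

```python
# Convert the quoted flyby distances into lunar distances (LD)
# and compare against NASA's ~7.5 million km monitoring threshold.
LD_KM = 385_000  # average Earth-Moon distance, as quoted above

flybys_km = {
    "2020 WN":   1.6e6,
    "2020 VW2":  7.0e6,
    "2020 WC":   1.6e6,
    "2017 WJ16": 2.0e6,
    "2020 TJ8":  6.4e6,
}
threshold_ld = 7.5e6 / LD_KM  # ~19.5 LD, matching the article

for name, km in flybys_km.items():
    status = "within monitoring range" if km <= 7.5e6 else "beyond threshold"
    print(f"{name}: {km / LD_KM:4.1f} LD ({status})")
print(f"threshold = {threshold_ld:.1f} LD")
```

All five pass inside the 7.5 million km monitoring range, which is why they make the watch list despite being millions of kilometers away.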

NASA and other space agencies around the globe keep a watchful eye 24/7, but there’s a lot of sky to cover, so humanity sometimes misses these close flybys, as was the case with a new record-setting close flyby which took place this month.

https://www.rt.com/news/507562-asteroid ... rth-flyby/
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Can a Computer Devise a Theory of Everything?

It might be possible, physicists say, but not anytime soon. And there’s no guarantee that we humans will understand the result.


Once upon a time, Albert Einstein described scientific theories as “free inventions of the human mind.” But in 1980, Stephen Hawking, the renowned Cambridge University cosmologist, had another thought. In a lecture that year, he argued that the so-called Theory of Everything might be achievable, but that the final touches on it were likely to be done by computers.

“The end might not be in sight for theoretical physics,” he said. “But it might be in sight for theoretical physicists.”

The Theory of Everything is still not in sight, but with computers taking over many of the chores in life — translating languages, recognizing faces, driving cars, recommending whom to date — it is not so crazy to imagine them taking over from the Hawkings and the Einsteins of the world.

Computer programs like DeepMind’s AlphaGo keep discovering new ways to beat humans at games like Go and chess, which have been studied and played for centuries. Why couldn’t one of these marvelous learning machines, let loose on an enormous astronomical catalog or the petabytes of data compiled by the Large Hadron Collider, discern a set of new fundamental particles or discover a wormhole to another galaxy in the outer solar system, like the one in the movie “Interstellar”?

At least that’s the dream. To think otherwise is to engage in what the physicist Max Tegmark calls “carbon chauvinism.” In November, the Massachusetts Institute of Technology, where Dr. Tegmark is a professor, cashed a check from the National Science Foundation, and opened the metaphorical doors of the new Institute for Artificial Intelligence and Fundamental Interactions.

The institute is one of seven set up by the foundation and the U.S. Department of Agriculture as part of a nationwide effort to galvanize work in artificial intelligence. Each receives $20 million over five years.

The M.I.T.-based institute, directed by Jesse Thaler, a particle physicist, is the only one specifically devoted to physics. It includes more than two dozen scientists from all areas of physics, drawn from M.I.T., Harvard, Northeastern University and Tufts.

“What I’m hoping to do is create a venue where researchers from a variety of different fields of physics, as well as researchers who work on computer science, machine-learning or A.I., can come together and have dialogue and teach each other things,” Dr. Thaler said over a Zoom call. “Ultimately, I want to have machines that can think like a physicist.”

More...

https://www.nytimes.com/2020/11/23/scie ... 778d3e6de3
swamidada
Posts: 1436
Joined: Sun Aug 02, 2020 8:59 pm

Post by swamidada »

Scientists identify gene responsible for aging in potential leap forward for regenerative medicine
1 Dec, 2020 20:26

Researchers have identified the process that causes the skin and other human organs to age, in what could prove a major breakthrough for the field of regenerative medicine.
Through "cellular reprogramming," a team from the University of Wisconsin-Madison discovered the mechanism that controls the aging and rejuvenation of mesenchymal stem cells (MSCs), which can change into a variety of different cell types, such as muscle or bone.

Their study, published in the journal Stem Cells on Monday, adds to what scientists already knew about how cellular processes cause MSCs to age.

"Our study goes further to provide insight into how reprogrammed MSCs are regulated molecularly to ameliorate the cellular hallmarks of aging," said one of its co-authors, Wan-Ju Li.

"We believe our findings will help improve the understanding of MSC aging and its significance in regenerative medicine."

Regenerative medicine is concerned with re-growing, replacing and healing organs or tissue damaged by age and disease.

The researchers extracted MSCs from human synovial fluid, the body's natural lubricant found around joints such as the knees and elbows. They then 'reprogrammed' the MSCs into an embryonic-like stem-cell state, from which, in principle, any cell type in the adult body can be derived.

The team found that a protein (GATA6) was repressed in the reprogrammed MSCs, leading to an increase in another protein (SHH) and the expression of a third (FOXP1), which is active in the development of the brain, heart and lungs.

Dr Jan Nolta, editor of Stem Cells, hailed the identification of this protein pathway in controlling the aging of MSCs as a "very important accomplishment."

The discovery comes after a separate recent study from researchers at Israel's Tel Aviv University, who claimed to have reversed the cellular aging process in a world first.

But compared to the University of Wisconsin-Madison study, the Israeli study is more controversial because of the process it used to lengthen telomeres – the protective caps at the ends of each chromosome. The shortening of telomeres later in life is linked to an increased risk of age-related diseases, including coronary heart disease, diabetes and cancer.

https://www.rt.com/news/508387-gene-tha ... iscovered/
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

They Made the ‘Pfizer Vaccine’

Hosted by Kara Swisher

While you’re waiting for a coronavirus vaccine, get to know the German power couple and co-founders of BioNTech who are behind the vaccine that Britain has just approved.


Listen to podcast and read the transcript at:

https://www.nytimes.com/2020/12/03/opin ... 778d3e6de3

Dr. Ozlem Tureci and Dr. Ugur Sahin, the co-founders of BioNTech, are behind the first coronavirus vaccine to be approved in the West. Starting next week, the “Pfizer vaccine” will be available in Britain.

While Pfizer is financing and distributing the vaccine, the science behind it was actually spearheaded by the couple’s lesser-known company. When Drs. Tureci and Sahin, along with their BioNTech team, embarked on this mission, the record for the fastest vaccine creation was four years. They did it in less than one.

BioNTech started working on a vaccine in January. By early November, the company shared the results of its Phase 3 trials: over 90 percent efficacy. The announcement was made days after the presidential election was called for Joe Biden, and Donald Trump claimed the timing was politically motivated.
swamidada
Posts: 1436
Joined: Sun Aug 02, 2020 8:59 pm

Post by swamidada »

For the First Time in 800 Years, a “Christmas Star” Will Illuminate the Skies This Month
Meghan Overdeep
Southern Living, Thu, December 3, 2020, 3:06 PM CST
Jupiter and Saturn are set to put on quite the show on December 21.

The Solar System’s two largest planets are on course to “collide” in an exceedingly rare event not seen since the Middle Ages.

On December 21, Jupiter and Saturn will come so close together that they will look like a “double planet,” a conjunction that will appear to us on Earth like one super-bright point of light.

“Alignments between these two planets are rather rare, occurring once every 20 years or so, but this conjunction is exceptionally rare because of how close the planets will appear to be to one another,” Patrick Hartigan, astronomer at Rice University, told Forbes. “You’d have to go all the way back to just before dawn on March 4, 1226, to see a closer alignment between these objects visible in the night sky.”
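
The "once every 20 years or so" figure falls straight out of the two planets' orbital periods (about 11.86 years for Jupiter and 29.45 for Saturn, values not given in the article): conjunctions recur at the pair's synodic period.

```python
# Jupiter-Saturn great conjunctions recur at the pair's synodic period:
# 1/T_synodic = 1/T_jupiter - 1/T_saturn.
T_JUPITER = 11.862  # orbital period in years
T_SATURN = 29.447

synodic = 1 / (1 / T_JUPITER - 1 / T_SATURN)
print(f"great conjunction every ~{synodic:.1f} years")  # ~19.9 years
```

What makes December 21 special is not the conjunction itself, which recurs on this ~20-year cycle, but how small the apparent separation is this time around.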

This incredible phenomenon has been known historically as the “Christmas Star” or “Star of Bethlehem.”

In fact, it’s theorized by many that the fabled star in the story of the Three Wise Men could have been a rare triple conjunction of Jupiter, Saturn, and Venus.

Hoping to catch a peek of the “great conjunction” yourself? According to Forbes, the event will be observable anywhere skies are clear for about an hour after sunset on December 21, the winter solstice. Be sure to mark your calendars: a conjunction this close won’t happen again until March 15, 2080!

https://currently.att.yahoo.com/att/fir ... 33473.html
swamidada
Posts: 1436
Joined: Sun Aug 02, 2020 8:59 pm

Post by swamidada »

Science
Isaac Newton Believed the Pyramids Revealed the Timing of the Apocalypse
Caroline Delbert
Popular Mechanics, Mon, December 7, 2020, 4:17 PM CST

Sir Isaac Newton believed the pyramids at Giza held the key to the end of the world.

Pyramidology is the broad pseudoscientific study of the "secrets" of the pyramids.

Newton faced religious persecution for combining Christianity and pyramidology.

Soon, a set of obscure notes by the foundational physicist Sir Isaac Newton will be auctioned at Sotheby’s. Yes, Newton “discovered gravity” in the popular imagination, but it’s his obsessive interest in pyramidology that shows in the previously unseen notes.

Like many people throughout history, Newton believed the pyramids in Egypt had spiritual and metaphysical significance in some way, and he studied them for years to try to reveal their “secrets.” The Guardian has more:

Newton was trying to uncover the unit of measurement used by those constructing the pyramids. He thought it was likely that the ancient Egyptians had been able to measure the Earth and that, by unlocking the cubit of the Great Pyramid, he too would be able to measure the circumference of the Earth.

He hoped that would lead him to other ancient measures, allowing him to uncover the architecture and dimensions of the Temple of Solomon—the setting of the apocalypse—and interpret the Bible’s hidden meanings.

It’s easy to reverse engineer a history where people like Newton were so-called pure scientists—untroubled by religious politics, pseudoscience, and other areas we don’t consider part of science today. But for thousands of years, the most rational scientists were still grappling with complicated problems that made it hard to rule out these more extreme theories. Without cell or atomic theory, for example, how could they conclude that miasma theory and the classical elements weren’t accurate?

All this is to say that even Newton was deep in his pseudoscience cups, studying alchemy and searching for esoteric meaning in different places. The pyramids of ancient Egypt, specifically the most photogenic and iconic batch at Giza, have always been a magnet for esoterica. In fact, the field of pseudoscience speculation about the pyramids is so large that, like ufology, it has its own name: pyramidology.

For Newton, pyramidology related to his interest in alchemy and even his underlying Christian beliefs. He thought the pyramids held secrets to the details of the specific Christian apocalypse detailed in the book of Revelation. And during Newton’s lifetime, mixing the default Christian beliefs with science or pseudoscience was verboten. Even though his research was directly related to how fervently he believed in the Bible, he had to hide these studies from everyone.

The book of Revelation is a drastic departure in tone and subject matter from the biblical books that precede it, and it’s stuffed with symbolism and numbers that have intrigued scholars for centuries. They’ve considered the puzzling book to be literally a puzzle, one whose solution could save the world. This is where Newton, too, found himself seeking complex answers.

Why are the pyramids involved in this story? Well, for a similar reason. The pyramids are such a staggering human achievement, surrounded by the rich cultural history of the ancient Egyptians who built them, that they became a magnet for western European exoticist beliefs.

People like Newton decided the Egyptians had secret, esoteric knowledge that had been lost. They looked at the pyramids as a living puzzle in the same way they looked at Revelation, with elements that represented a code they believed they saw.

Seeing patterns everywhere is part of what makes the human brain so powerful and special, and for Newton, it was much less clear which patterns were pseudo rather than science. Indeed, some lucky auction buyer will get to read all about Newton's pyramidologist ideas up close.

https://currently.att.yahoo.com/finance ... 00103.html
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

swamidada wrote: Why are the pyramids involved in this story? Well, for a similar reason. The pyramids are such a staggering human achievement, surrounded by the rich cultural history of the ancient Egyptians who built them, that they became a magnet for western European exoticist beliefs.

People like Newton decided Egyptians had secret, esoteric knowledge that had been lost. They looked at the pyramids as a living puzzle the same way Revelations was, with elements that represented a code they believed they saw.
Very interesting article. As I have always maintained, the Dwapara Yuga, during which the pyramids were built, was far more advanced. Hence there would be a lot of secrets to unravel, and this will happen as we advance towards more enlightened Yugas.

So far no one has come up with a satisfactory explanation of how and why the pyramids were built.
swamidada
Posts: 1436
Joined: Sun Aug 02, 2020 8:59 pm

Post by swamidada »

JANUARY 02, 2021
Muslims and technology
Pervez Hoodbhoy

EXCEPT for some defiant holdouts, most Muslims have come to accept the printing press, loudspeaker, weather forecasts, cameras and television, blood transfusions, organ transplants, and in-vitro fertilization. Earlier fears that technology would destroy their faith are disappearing. Although religious extremists have killed polio vaccine workers by the dozens, Pakistanis are likely to accept the Covid vaccine more easily than Americans. This is progress.

Technology for religious rituals is also becoming popular. For example, you may buy a small gadget called the Salat Card which uses proximity sensors to count the number of rakats performed during prayers. Also available online is an environmentally friendly wuzu (ablution) machine using visual sensors. Responding to public complaints of muezzins with rasping voices or poor pronunciation, Egypt’s government is carrying out an experimental airing in 113 Cairo mosques where a computer will initiate the standardized azan at exact times. A few years ago, multiple fatwas would have lambasted such innovations. But not anymore.

Without a culture of science Muslims will continue consuming technology without producing much.

What of science, the fount of technology? Consuming technology does not, of course, resolve conflicts between science and religion. Nor does it necessarily mean that science as a way of looking at the world is gaining ascendancy. The latter question motivated the 2020 Task Force Report on Culture of Science in the Islamic World. Led by Prof Nidhal Guessoum (Sharjah) and Dr Moneef Zou’bi (Jordan), with input from Dr Athar Osama (Pakistan), its online survey gives some hints.

At one level the results are encouraging. Their survey of 3,500 respondents, chosen mostly from Arab countries and Pakistan, shows knowledge of basic scientific facts as slightly better than in developed countries. The authors concede that this surprising result is probably because relatively educated and internet-savvy respondents were chosen. Still, one hopes that this is not too inaccurate.

But even if true, knowing facts about science is unconnected with having a scientific mindset. The difference is like that between a USB memory stick (where you dump data) and a CPU chip (which is the decision-making brain of your laptop or smartphone). The first is passive, relatively simple and cheap. The second is active, extremely complex and costly.

Correspondingly, the traditional mindset takes knowledge to be a corpus of eternal verities to be acquired, stockpiled, disseminated, understood and applied but not modified or transformed. The scientific mindset, on the other hand, involves forming, testing and, if necessary, abandoning hypotheses that don’t work. Analytical reasoning and creativity are important; simple memorization is not.

Going through the report, it is unclear to me whether the questions asked — and the answers received — have helped us understand whether Muslims are moving towards a scientific worldview. Perhaps the organizers thought that asking difficult questions upfront is too dangerous. But the strong emphasis they place upon freedom, openness and diversity as a condition for nurturing science is praiseworthy.

Here’s why science — and developing a scientific mindset — is so difficult and alien. Humans are never completely comfortable with science because it is not common sense. In our daily lives we sometimes have to struggle against science. As children we learned that it is actually the earth that moves, and yet we still speak of the sun rising and setting!

Another example: heavier things fall faster, right? This is so obviously true that nobody tested it until Galileo showed, 400 years ago, that it is wrong. Wouldn’t it be so much nicer if the laws of physics and biology lined up with our naïve intuition and religious beliefs? Or if Darwin were wrong and living things didn’t evolve through random mutation? Unfortunately, science is chock-full of awkward facts. Getting to the truth takes a lot of work, so you have to be very thorough and very curious.
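
Galileo's point is visible in the equations: with air resistance neglected, the fall time from rest is t = sqrt(2h/g), and the mass cancels out entirely. A minimal sketch (the heights and masses are illustrative values only):

```python
import math

# Drag-free fall: m*a = m*g, so the mass cancels and t = sqrt(2h/g).
G = 9.81  # gravitational acceleration, m/s^2

def fall_time(height_m, mass_kg):
    """Time to fall from rest; mass_kg is accepted only to show it is unused."""
    return math.sqrt(2 * height_m / G)

feather_t = fall_time(10.0, mass_kg=0.005)  # 5-gram object from 10 m
cannonball_t = fall_time(10.0, mass_kg=5.0) # 5-kilogram object from 10 m
print(feather_t == cannonball_t)  # True: both hit the ground together
```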

Historically, lack of curiosity is why Muslim civilizations were ultimately defeated. After the Arab Golden Age petered out, the spirit of science also died. The 17th-century Ottoman sultans were rich enough to hire technologists from Europe to build ships and cannons (there were no Chinese then) but they could not produce their own experts from the ulema-dominated educational system. And when the British East India Company brought inventions and products from an England humming with new scientific ideas, the Mughals simply paid cash for them. They never asked what makes the gadgets work or even how they could be duplicated.

Without a scientific culture, a country can only consume and trade. Pakistan functions as a nation of shopkeepers, property dealers, managers, hoteliers, accountants, bankers, soldiers, politicians and generals. There are even a few good poets and writers. But there is no Pakistani Covid vaccine. With so few genuine scientists and researchers it produces little new knowledge or products.

That 81 Pakistanis were recently ranked in the world’s top two per cent of scientists by Stanford University turned out to be fake news. Stanford University was not involved in this highly dubious ranking. This was confirmed to me over email last week by Prof John Ioannidis of Stanford University. He, together with three co-authors, had been cited as the source.

What will it take to bring science back into Islam? The way may be similar to how music and Islam — which in principle are incompatible — are handled in Muslim countries today.

It is perfectly usual for a Pakistani FM radio station that is blaring out Lady Gaga’s songs to briefly pause, broadcast a pre-recorded azan in all its dignified solemnity, and then resume with Beyoncé until the next azan. Although the choice of music is quite abysmal, there is a clean separation of the worldly from the divine.

Separation is the key! When Galileo famously said “the Bible teaches us how to go to heaven, not how the heavens go”, he was arguing that the domains of science and belief do not overlap. This is how the West, China, and India developed modern scientific cultures. Centuries earlier, Muslim scholars had readily absorbed Greek learning while keeping their religious beliefs strictly personal. This made possible major discoveries and inventions. Whether one likes it or not, there is no other way to develop a culture of science.

The writer is an Islamabad-based physicist and writer.

Published in Dawn, January 2nd, 2021

https://www.dawn.com/news/1599215/musli ... technology
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

BOOK REVIEW

Electrons, Photons, Gluons, Quarks: A Nobel-Winning Physicist Explains It All

FUNDAMENTALS
Ten Keys to Reality
By Frank Wilczek


Whether or not you’re accustomed to reading physics for pleasure, the Nobel laureate Frank Wilczek’s “Fundamentals” might be the perfect book for the winter of this plague year. Early on, Wilczek quotes the 17th-century French physicist and philosopher Blaise Pascal’s lament, “The universe grasps me and swallows me up like a speck.” For Pascal, that thought produced intense spiritual anxiety, but for the contemporary reader it might actually provide a certain comfort: Whatever obscene amount of damage we’ve managed to do here on Earth is insignificant when seen on an astronomical scale. Wilczek has a more optimistic take, though, based on quantifying the space inside us: The number of atoms in a single human body is roughly 10^28 (1 followed by 28 zeros), “a million times the number of stars in the entire visible universe.” He sees potential in our inner vastness, too.

Another way to write that number is 10 octillion, and “Fundamentals” is filled with facts like these — the kind of question adults think they can answer until their children ask. How long until the Earth is swallowed by the sun? How does GPS work? How many thoughts can a person have in a lifetime? (Based on an average speech rate of two words per second, Wilczek estimates approximately a billion.)
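
Wilczek's billion-thoughts figure is a Fermi estimate. A back-of-envelope version lands at the same order of magnitude; only the two-words-per-second speech rate comes from the text, while the lifespan and words-per-thought values below are my illustrative assumptions:

```python
# Back-of-envelope version of the lifetime-thoughts estimate.
WORDS_PER_SECOND = 2    # speech rate quoted in the review
LIFETIME_YEARS = 100    # assumed round-number lifespan
WORDS_PER_THOUGHT = 6   # assumed: a "thought" as a short phrase

seconds = LIFETIME_YEARS * 365.25 * 24 * 3600
thoughts = seconds * WORDS_PER_SECOND / WORDS_PER_THOUGHT
print(f"~{thoughts:.1e} thoughts")  # order of 10^9, matching "a billion"
```

As with any Fermi estimate, the individual inputs are rough; the point is that the answer is robustly in the billions, not the millions or trillions.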

Although Wilczek’s voice here is endearingly humble, it’s clear that his mind was never like that of most kids piping up from the back seat. He recalls that one of his “earliest childhood memories is of a small notebook I kept when I was first learning about relativity, on the one hand, and algebra, on the other.” Wilczek grew up in New York City and attended public school in Queens, graduating from high school in two years. As a teenager trailing his mother in the grocery store, he was taken with the brand name of a laundry detergent called Axion, and promised himself that if he ever discovered an elementary particle, he’d give it that name. Incredibly, in 1978, Wilczek did identify a hypothetical particle — one that coincidentally solved a problem related to axial currents — and was able to fulfill that fantasy.

Wilczek was still a graduate student at Princeton when he and David Gross developed the theory of asymptotic freedom, an explanation for the way quarks interact with one another inside the nucleus of an atom, clarifying the workings of the strong force, also called quantum chromodynamics. The theory explained a seeming paradox in the behavior of these elementary particles — that they attract one another more forcefully at a distance than in proximity — a discovery that earned him and Gross, along with David Politzer, the 2004 Nobel Prize.

Wilczek writes with breathtaking economy and clarity, and his pleasure in his subject is palpable. He lays out the elementary particles of matter — electrons, photons, gluons and quarks — and their strikingly short list of properties: mass, charge and spin. He then defines four principles that characterize the four basic forces in nature: electromagnetism, gravity, the strong force and the weak force. Most people vaguely remember electromagnetic fields from high school physics, but Wilczek makes very clear the way that those “space-filling” fields are contiguous with the smallest building blocks of matter: “We now understand particles as manifestations of a deeper, fuller reality. Particles are avatars of fields.” It’s a beautiful description that would be especially evocative for today’s game-fluent high school students.

Sometimes, to see if you understand a concept in physics, it helps to try to explain it to someone else. Wilczek points out that the elementary particles “aren’t even solid bodies. Indeed, though it’s convenient to call them ‘elementary particles,’ they aren’t really particles. … Our modern fundamental ingredients have no intrinsic size or shape.”

In trying to paraphrase this enchanting idea for my husband, I realized that I didn’t actually know how something with no size or shape could have mass. I thought Wilczek might not enlighten me, and then a chapter later he did, articulating the concept this way: “Quarks have very small masses, and gluons have zero mass. But inside protons they are moving around very fast, and thus they carry energy. All that energy adds up. When the accumulated energy is packaged into an object that is at rest overall, such as the proton as a whole, then that object has the mass m = E/c².” That inverted version of Einstein’s famous formula, incidentally, is one of the things Wilczek remembers writing down in his childhood notebook. What a reader gets in “Fundamentals” is the native language of physics — mathematics — precisely translated by someone who has spent a lifetime (about a billion thoughts!) on these forces that shape our physical world.
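Wilczek's point that the proton's mass is almost entirely confined energy can be checked numerically with the inverted formula. A minimal sketch, using the standard measured proton rest energy of about 938.27 MeV (the numbers and code here are ours, not from the book):

```python
# Mass from confined energy via m = E / c^2.
# The proton's rest energy, ~938.27 MeV, is almost entirely the kinetic and
# field energy of its quarks and gluons, not their tiny intrinsic masses.
C = 299_792_458.0                 # speed of light, m/s
MEV_TO_JOULES = 1.602176634e-13   # 1 MeV expressed in joules

proton_rest_energy_mev = 938.272
E = proton_rest_energy_mev * MEV_TO_JOULES  # rest energy in joules
m = E / C**2                                # mass in kilograms

print(f"{m:.3e} kg")  # ~1.673e-27 kg, the measured proton mass
```

That the result matches the proton mass found in any table of constants is exactly Wilczek's claim: packaged energy at rest simply is mass.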

Beyond the facts, “Fundamentals” is full of the kind of heady ideas that keep laypeople reading about contemporary physics: the possibility that the mysterious “dark matter” that makes up 25 percent of the mass of our universe might actually be a remnant of theoretical particles called axions in the very early universe, an invisible cousin of the cosmic microwave background radiation, also a relic of the Big Bang; or the idea that with a biological engineering technique called “modulated self-reproduction” it might be possible to “terraform” a new planet. In a book this far-reaching, it’s understandable that Wilczek spends only a few pages on climate change, focusing mostly on the enormous potential of solar energy. The optimism inherent in chapter titles like “There’s Plenty of Time” and “There’s Plenty of Space” can seem Panglossian next to the reality of what we’re facing on Earth in the next few decades.

I think Wilczek might answer that criticism by talking about complementarity, an idea that he’s elevated to an intellectual credo: “the concept that one single thing, when considered from different perspectives, can seem to have very different or even contradictory properties.” He explains that in physics, when a model becomes too complicated, an alternative model can help answer important questions.

“Fundamentals” offers readers just that sort of radical shift: the way that energy, seen from another angle, is a particle; the way that space-time could be a form of matter; the way that stepping outside a catastrophe to look at it on a cosmic scale might actually be the first step toward a solution.

https://www.nytimes.com/2021/02/08/book ... ks_norm_20
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Mars Mission From the U.A.E. Begins Orbit of Red Planet

The Hope spacecraft fired its engines on Tuesday and was captured by the planet’s gravity to begin its atmospheric science studies.


The first in a parade of three new visitors to Mars has arrived.

On Tuesday, the United Arab Emirates became just the fifth nation to successfully send a spacecraft to Mars when its robotic probe, named Hope, began orbiting the red planet.

It is the first interplanetary mission undertaken by an Arab country. In recent days, a number of prominent buildings and monuments in the wealthy oil country, which is about the size of Maine, were lit up at night in red in honor of Mars, the red planet.

“From the U.A.E. government’s perspective, basically 90 percent of this mission has been achieved successfully,” said Omran Sharaf, the project manager of Hope in an interview ahead of the spacecraft’s arrival.

For the remaining 10 percent, there was little to do but watch and wait as the spacecraft executed instructions already loaded into its computer.

Sarah al-Amiri, who leads the science portion of the mission, said she had felt a full range of emotions when the spacecraft was launched last summer. But as it approached Mars, she said, “This is further intensifying them.”

Once in orbit, the spacecraft can begin its study of the red planet’s atmosphere and weather.

On Tuesday controllers at the mission operations center in Dubai first received word from the spacecraft that it had started firing thrusters to slow itself down and allow it to fall into the thrall of the gravity of Mars. Then, after the 27-minute burn was complete, they confirmed that the probe was in orbit.

Cheers erupted in the control room, where the mission’s managers sat at sleek computer consoles that would be at home on the bridge of a starship from “Star Trek.”

More....

https://www.nytimes.com/2021/02/09/scie ... lanet.html

*******
Einsteinium Is Mysterious. Scientists Have Unlocked Some of Its Secrets.

Number 99 on the periodic table does not occur naturally and is difficult to make and store, challenging researchers who want to study it.


Einsteinium is an element with a famous name that almost no one has heard of.

With 99 protons and 99 electrons, it sits in obscurity near the bottom of the periodic table of chemical elements, between californium and fermium. It first showed up in the explosive debris of the first hydrogen bomb in 1952, and the team of scientists who discovered it gave it a name to honor Albert Einstein.

Even today, scientists know little about it.

Einsteinium is highly radioactive. No form of it is stable; every isotope falls apart within a few years, so it is not found in nature. It can be produced in a few specialized nuclear reactors, but only in minute amounts.

Writing in the journal Nature, researchers led by Rebecca J. Abergel, who leads the heavy element chemistry group at Lawrence Berkeley National Laboratory in California, reported on Wednesday that they have now worked out some basic chemical properties of einsteinium.

It was not easy. Indeed, Dr. Abergel described her paper as the culmination of “a long series of unfortunate events.”

More...

https://www.nytimes.com/2021/02/07/scie ... 778d3e6de3
swamidada
Posts: 1436
Joined: Sun Aug 02, 2020 8:59 pm

Post by swamidada »

Watch the last billion years of Earth's tectonic plate movement in 40 seconds
Aylin Woodward
Business Insider Sun, February 14, 2021, 7:18 AM
A map of Earth's current tectonic plate boundaries. Eric Gaba for Wikimedia Commons
Geologists created an animation that shows how Earth's tectonic plates moved around over the last billion years.

The animation reveals the formations that came before our current seven continents and five oceans.

The land mass that became Antarctica once sat along the Equator. Over Earth's history, several supercontinents have broken up and come back together like the Backstreet Boys.

Our current seven continents and five oceans are the result of more than 3 billion years of planetary evolution, with tectonic plates crisscrossing atop the semi-solid ooze of Earth's mantle.

But charting the precise movements of those plates over all that time is challenging; existing models are often piecemeal, span only a few million years, or focus on just continental or oceanic changes, not both.

Now, for the first time, a group of geologists has offered up an easily digestible peek at 1 billion years of plate tectonic motion.

The geoscientists, from the University of Sydney, spent four years reconstructing how landmasses and oceans changed over the last billion years. As part of a recent study, they animated those changes into the short video below.


The animation shows green continents lumbering across oceans, which are represented in white. The “Ma” at the top of the video is geologic shorthand for “million years ago” - so 1,000 Ma is 1 billion years ago. The various colored lines represent different types of boundaries between tectonic plates: Blue-purple lines represent divergent boundaries, where plates split apart; red triangles indicate convergent boundaries, where plates move together; and grey-green curves show transform boundaries, where plates slide sideways past each other.

"These plates move at the speed fingernails grow, but when a billion years is condensed into 40 seconds, a mesmerizing dance is revealed," Sabin Zahirovic, a University of Sydney geologist who co-authored the new study, said in a press release.

Building a better model of Earth's plates
A map shows what Pangea looked like 200 million years ago, with tectonic plate boundaries in white. Wikimedia Commons
The Earth formed about 4.5 billion years ago, and then it cooled down enough to form a solid crust with individual plates roughly 1.2 billion years after that.

Today, one can imagine the planet as a chocolate truffle - a viscous center ensconced in a hardened shell. The center consists of a 1,800-mile-thick, semi-solid mantle that encircles a super-hot core. The top layer - only about 21 miles thick - is the crust, which is fragmented into tectonic plates that fit together.

These plates surf atop the mantle, moving around as hotter, less dense material from deep within the Earth rises toward the crust, and colder, denser material sinks towards the core.

Geologists can piece together a picture of which plates were where hundreds of millions of years ago by analyzing what's known as paleomagnetic data. When lava at the junction of two tectonic plates cools, some of the resulting rock contains magnetic minerals that align with the directions of Earth's magnetic poles at the time the rock solidified. Even after the plates containing those rocks have moved, researchers can study that magnetic alignment to parse out where on the global map those natural magnets existed in the past.
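One standard piece of this reconstruction can be sketched in a few lines. Under the common geocentric-axial-dipole assumption, the inclination of a rock's frozen-in magnetization relates to the latitude at which it formed by tan(inclination) = 2 × tan(latitude). The function below is our own illustration of that textbook relation, not code or data from the study, and the sample inclination is hypothetical.

```python
import math

# Paleolatitude from magnetic inclination under the geocentric axial
# dipole assumption: tan(inclination) = 2 * tan(latitude).
def paleolatitude(inclination_deg: float) -> float:
    inclination = math.radians(inclination_deg)
    return math.degrees(math.atan(math.tan(inclination) / 2))

# A rock whose frozen-in magnetization dips at ~49 degrees (hypothetical
# measurement) formed near 30 degrees latitude, wherever its plate sits today.
print(round(paleolatitude(49.1), 1))  # ≈ 30
```

Inclination alone pins down only the latitude, not the longitude, which is one reason full plate reconstructions like this one must combine paleomagnetics with other geological evidence.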

Using both paleomagnetics and current tectonic plate data, the study authors were able to create the most thorough map of each plate's journey from 1 billion years ago until the present.

A map of the Atlantic Ocean floor. NASA Earth Observatory maps by Joshua Stevens, using data from Sandwell, D. et al. (2014)
"Simply put, this complete model will help explain how our home, planet Earth, became habitable for complex creatures," Dietmar Müller, a co-author of the study, said in a press release.

The jigsaw puzzle of Earth's continents hasn't stopped shifting, of course. The Pacific Ocean, for example, is shrinking year by year. The Atlantic, meanwhile, is widening - pushing the Americas away from Africa and Europe.

https://currently.att.yahoo.com/news/wa ... 00244.html
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Meet Elizabeth Ann, the First Cloned Black-Footed Ferret

Her birth represents the first cloning of an endangered species native to North America, and may bring needed genetic diversity to the species.


Watch videos here

https://www.nytimes.com/2021/02/18/scie ... iversified

Last year, Ben Novak drove across the country to spend New Year’s Eve with a black-footed ferret. Elizabeth Ann had just turned 21 days old — surely a milestone for any ferret but a particularly meaningful one for Elizabeth Ann, the first clone of any native endangered animal species in North America.

Mr. Novak, the lead scientist of the biotechnology nonprofit Revive & Restore, bought a trailer camper to drive his wife and identical twin toddlers from North Carolina to the National Black-footed Ferret Conservation Center near Fort Collins, Colo. (They made one pit stop in Texas to see Kurt, the first cloned Przewalski’s horse.)

Mr. Novak spent less than 15 minutes with Elizabeth Ann, whose black mask, feet and tail were just beginning to show through her downy white fur. “It felt like time stopped,” Mr. Novak said.

Thankfully, time has not stopped for Elizabeth Ann, who now looks bigger, browner and considerably more like a ferret. Her successful cloning is the culmination of a yearslong collaboration with the U.S. Fish and Wildlife Service, Revive & Restore, the for-profit company ViaGen Pets & Equine, San Diego Zoo Global and the Association of Zoos and Aquariums.

Cloned siblings are on the way, and potential (cloned) mates are already being lined up. If successful, the project could bring needed genetic diversity to the endangered species. And it marks another promising advance in the wider effort to use cloning to retrieve an ever-growing number of species from the brink of extinction.

The black-footed ferret, the first species to be reintroduced to former habitats with the help of artificial insemination, has long been a model species for new conservation technologies. So it is fitting that the ferrets have become the second species to be cloned for this type of genetic rescue. (Elizabeth Ann follows in the footsteps of Kurt the horse.)

“Pinch me,” joked Oliver Ryder, the director of conservation genetics at San Diego Zoo Global, over a Zoom call. “The cells of this animal banked in 1988 have become an animal.”

Videos and more...

https://www.nytimes.com/2021/02/18/scie ... iversified
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Perseverance’s Pictures From Mars Show NASA Rover’s New Home

Scientists working on the mission are eagerly scrutinizing the first images sent back to Earth by the robotic explorer.


Video, images and illustrations at:

https://www.nytimes.com/2021/02/19/scie ... 778d3e6de3

The newest Martian, a robot named Perseverance, is alive and well after its first day and night on the red planet, NASA scientists and engineers said on Friday. Members of the triumphant team managing the spacecraft were exhilarated as they shared pictures captured by its cameras during landing and after the rover reached the surface.

“As scientists, we’re used to the engineers showing us animations of the rover, and that’s at first what I thought this was,” said Katie Stack Morgan, a deputy project scientist for the mission at NASA’s Jet Propulsion Laboratory, referring to one of the pictures. “And then I did a double take and said that’s the actual rover.”

The picture shared by NASA showed the rover during the final stages of its landing, when a piece of the spacecraft called the sky crane, which is sometimes compared to a jetpack, gently lowered the vehicle to the surface.

“We are overwhelmed with excitement and overjoyed to have successfully landed another rover on the surface of Mars,” said Adam Steltzner, the chief engineer for the rover mission.

The system was also used during the landing of the Curiosity rover in 2012 and contributed to the safe arrival of both robotic explorers on the tricky terrain of the fourth planet from the sun. After the rover set down, the sky crane flew away to a safe location where its landing would not cause any damage to the mission.

Another photograph, taken from the Mars Reconnaissance Orbiter, a NASA spacecraft that has been studying Mars from space since 2006, showed the rover hanging from its parachute as it drifted over the Martian terrain. The rover hangs over Jezero Crater, the site that NASA selected for its latest Mars landing.

The rover landed on a “pool-table-flat” spot littered with small rocks in the middle of Jezero Crater, thought to be the dry basin of a lake that existed 3.8 billion years ago. It is sitting near the edge of a rougher area of fractured terrain that the scientists have named Canyon de Chelly after the Navajo site of the same name in Arizona. The rover is about 1.25 miles from a river delta scientists think is a prime spot to look for chemical signatures of ancient microbial life.

And so the adventure begins.

Showing off a close-up image of one of the rover’s wheels surrounded by small rocks with pitted surfaces, Hallie Gengl Abarca, an engineer who leads work on the rover’s data systems, said, “One of the amazing things is now that we have this image data, now we can hand this over to the robotic and science teams.”

One of the mission’s first orders of scientific business will be to study the rocks in the crater and to work out whether they are volcanic basalt or sedimentary rocks. If the rocks are sedimentary, the area might have been habitable long ago; if they are volcanic, it will allow geologists to calculate their age.

“I think we can say these rocks are between 3.8 and 3.6 or 3.7 billion years old,” Dr. Morgan of the Jet Propulsion Lab said. “So this is the time in Mars history when water was stable on the surface of Mars, and we think this area would have been a habitable environment.”

About every two years for decades now, a diverse armada of spacecraft from Earth have been heading for Mars when the planets are fortuitously aligned. NASA was joined this month by China and the United Arab Emirates, which both sent spacecraft to study the planet. It has become part of an international scientific effort to find out if life on Earth ever had any company in the solar system. Not all of the spacecraft have made it, and even NASA experienced a number of failed missions to Mars in the 1990s.

NASA’s twin Viking spacecraft, which landed in 1976, famously dug in the sand and performed experiments looking for living microbes but came up empty, perhaps because scientists didn’t understand Martian chemistry.

The Curiosity rover, which arrived in 2012, is still roaming a place called Gale Crater. It found geological evidence of past water, the essential ingredient for life as we think we know it. Perseverance, eight years in the making, is designed to search the places where water once was for so-called biosignatures — fossils or any other evidence of once-living organisms — according to Dr. Steltzner.

“When we do such investments,” he said, “we do them for humanity, and we do them as a gesture of our humanity.”

More...

https://www.nytimes.com/2021/02/19/scie ... 778d3e6de3
swamidada
Posts: 1436
Joined: Sun Aug 02, 2020 8:59 pm

Post by swamidada »

Neanderthals died out after Earth's magnetic poles flipped, causing a climate crisis 42,000 years ago, a study says
Aria Bendix
Business Insider Sat, February 20, 2021, 6:32 AM
An exhibit shows a Neanderthal family at the Neanderthal Museum in Krapina, Croatia, in February 2010. Reuters/Nikola Solic
Earth's magnetic poles flipped 42,000 years ago, which may have triggered a global climate crisis, a new study found.

The resulting changes in temperatures and radiation levels may have killed off many large mammals.

Scientists have known about the flip since the late 1960s. Earth's magnetic poles aren't static - they're generated by electric currents from the planet's liquid outer core, which is constantly in motion. In recent years, Earth's magnetic North pole has wandered considerably on a path toward northern Russia.

But for the most part, scientists didn't think the last pole flip had a major environmental impact. Sure, the planet's magnetic field got weaker, allowing more cosmic rays to penetrate the atmosphere, but plant and animal life wasn't known to have been greatly affected.

A new study now suggests a more dramatic phenomenon occurred: The additional cosmic rays may have depleted ozone concentrations, opening the floodgates for more ultraviolet radiation in the atmosphere. Shifting weather patterns may have expanded the ice sheet over North America and dried out Australia, prompting the extinction of many large mammal species. A solar storm, meanwhile, might have driven ancient humans to seek shelter in caves.

"It would have been an incredibly scary time, almost like the end of days," Chris Turney, an Earth scientist at the University of New South Wales, said in a video describing the new research.

Scientists have not agreed on a definitive theory about why Neanderthals disappeared. Some research suggests their extinction happened naturally, as Neanderthals interbred with modern humans or the population became too small to hunt, mate, and raise children. Other scientists have posited that Neanderthals may have been out-competed for resources as modern humans started to populate Europe.

But it's probably no coincidence that Neanderthals died out following a major shift of Earth's magnetic poles, Turney's study suggests.

"It was only when you started talking between different areas of science, you could see the connections," his co-author, Alan Cooper, said. "Before that, none of the different fields had worked out 42 [42,000 years ago] was the key event."

Ancient trees and caves hold clues about a possible climate disaster
A scientist takes measurements for the archaeo-magnetic survey in the Bruniquel Cave in southwestern France in this undated handout photo after the discovery there of mysterious ring-shaped structures fashioned about 176,500 years ago by Neanderthals. Etienne FABRE - SSAC/Handout via Reuters

To find out what happened to Earth's climate 42,000 years ago, scientists asked a native New Zealander who was alive at the time: the ancient kauri tree. The tree's rings serve as a record of atmospheric levels of radiocarbon - a radioactive isotope of carbon - over tens of thousands of years. Indeed, the rings showed evidence of rising radiocarbon at the time when the magnetic fields flipped, an event known as the "Laschamps excursion."

The event isn't unique in the history of our planet: The British Geological Survey estimates that four or five pole flips occur every million years.

During these reversals, the magnetic shield that protects our planet from solar wind (charged particles streaming off the sun) gets weaker. Earth's magnetic North and South poles - not to be confused with the planet's northernmost and southernmost geographic points - switch places.

The Laschamps excursion, the most recent example of this magnetic flip, likely took place over a period of 1,000 years. That's a blip in Earth's lifetime, but long enough to alter the fates of those living on the planet.

"In that process of flipping from North to South and South to North, effectively the Earth's magnetic field almost disappeared," Turney said. "And it opened the planet up to all these high-energy particles from outer space."

If the sun spewed extra-high levels of radiation in a solar storm during that time, Neanderthals may have needed to seek cover.

Indeed, the Laschamps excursion coincided with a rise in cave use across Europe and Southeast Asia. In particular, researchers have found red ocher handprints in the regions' caves that date back some 40,000 years. According to the new study, this pigment could have served as an ancient form of sunscreen.

Red ocher handprints in Spain's El Castillo cave may represent the use of an ancient form of sunscreen. Paul Pettitt, Gobierno de Cantabria
Another magnetic reversal could be 'imminent'
Not all researchers are convinced by Turney and Cooper's analysis. Chris Stringer, an anthropologist at the Natural History Museum in London, told The Guardian that although the Laschamps excursion could have contributed to Neanderthals' demise, it's hard to know exactly when they died out.

"They did survive longer and ranged more widely than just Europe, and we have a very poor fix on the timing of their final disappearance across swathes of Asia," Stringer said.

James Channell, a geologist at the University of Florida, told NPR that historical records of ice cores dating back 42,000 years don't indicate a global climate crisis. Still, he added, "there does appear to be a linkage" between the extinction of large mammals and the weakening of Earth's magnetic field.

Scientists know that Earth's magnetic field has weakened about 9% in the past 170 years. The magnetic North pole has also been drifting more rapidly since the 1990s, at a rate of 30 to 40 miles per year.

This has "increased speculation that a field reversal may be imminent," the researchers wrote. Such an event could potentially topple power grids and satellite networks. An increase in radiation could also expose more people to diseases like cancer.

But scientists suspect that any possible magnetic reversal would be in its early stages. Earth's magnetic field is still far stronger than it was the last time the poles flipped.

https://currently.att.yahoo.com/news/ne ... 00659.html
swamidada
Posts: 1436
Joined: Sun Aug 02, 2020 8:59 pm

Post by swamidada »

Archeologists find intact ceremonial chariot near Pompeii
COLLEEN BARRY
Associated Press Sat, February 27, 2021, 11:19 AM
A view of a chariot, with its iron elements, bronze decorations and mineralized wooden remains, that was found in Civita Giuliana, north of Pompeii. Officials at the Pompeii archaeological site near Naples on Saturday, Feb. 27, 2021, announced the first-ever discovery of an intact ceremonial chariot, one of several important discoveries made in the same area outside the park following an investigation into an illegal dig. (Parco Archeologico di Pompei via AP)
A detail of the decoration of the chariot. (Parco Archeologico di Pompei via AP)
MILAN (AP) — Officials at the Pompeii archaeological site in Italy announced Saturday the discovery of an intact ceremonial chariot, one of several important discoveries made in the same area outside the park near Naples following an investigation into an illegal dig.

The chariot, with its iron elements, bronze decorations and mineralized wooden remains, was found in the ruins of a settlement north of Pompeii, beyond the walls of the ancient city, parked in the portico of a stable where the remains of three horses previously were discovered.

The Archaeological Park of Pompeii called the chariot “an exceptional discovery” and said "it represents a unique find - which has no parallel in Italy thus far - in an excellent state of preservation.”

The eruption of Mount Vesuvius in 79 AD destroyed Pompeii. The chariot was spared when the walls and roof of the structure it was in collapsed, and also survived looting by modern-day antiquities thieves, who had dug tunnels through to the site, grazing but not damaging the four-wheeled cart, according to park officials.

The chariot was found on the grounds of what is one of the most significant ancient villas in the area around Vesuvius, with a panoramic view of the Mediterranean Sea, on the outskirts of the ancient Roman city.

Last year, archaeologists found in the same area on the outskirts of Pompeii, Civita Giuliana, the skeletal remains of what are believed to have been a wealthy man and his male slave attempting to escape death.

The chariot's first iron element emerged on Jan. 7 from the blanket of volcanic material filling the two-story portico. Archaeologists believe the cart was used for festivities and parades, perhaps also to carry brides to their new homes.

While chariots for daily life or the transport of agricultural products have been previously found at Pompeii, officials said the new find is the first ceremonial chariot unearthed in its entirety.

The villa was discovered after police came across the illegal tunnels in 2017, officials said. Two people who live in the houses atop the site are currently on trial for allegedly digging more than 80 meters of tunnels at the site.

https://currently.att.yahoo.com/news/ar ... 49440.html
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Once Upon a Time on Mars

A dune buggy is about to set off on behalf of its human owners to fulfill a primordial yearning.


Three and a half billion years ago, waves splashed and streams surged across this dusty expanse on Mars now known as Jezero Crater. On a nascent Earth, chemistry was coagulating toward the exalted state we call life.

Astronomers, philosophers and science fiction writers have long wondered whether nature ran the same experiment there as on Earth. Was Mars another test tube for Darwinian evolution? No longer will you be laughed out of biology class for speculating that life actually evolved on Mars first and drifted to Earth on a meteorite, or that both planets were seeded with microbes or proto-life from somewhere even farther away.

So humans have sent their progeny across time and 300 million miles of space in search of long-lost relatives, ancient roots of a family tree that might be traced in the Red Planet’s soil.

The Perseverance rover and its little sibling, the Ingenuity helicopter, landed in a cloud of grit on Feb. 18, bristling with antennas and cameras. Perseverance will spend the next Martian year — the equivalent of two Earth years — prowling, poking and collecting rocks from Jezero Crater and the river delta that enters it. The rover will scrutinize the debris chemically and geologically, and take photographs, so that scientists on Earth can search for any signs of ancient fossilization or other patterns that living organisms might have produced.

Perseverance and Ingenuity operate on very long leashes: 12 minutes of light-travel time — and signal delay — across the ether from Pasadena, where their creators and tenders wait to see what they have accomplished lately. Like the teenagers you let out the door with the car keys, Perseverance and Ingenuity are no more intelligent and responsible than humans have trained them to be.
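That 12-minute leash can be sanity-checked with back-of-the-envelope arithmetic. A minimal Python sketch, assuming an Earth-Mars separation of about 134 million miles at the time (the 300 million miles quoted earlier is the length of the journey, not the straight-line distance, so that figure is an assumption for illustration):

```python
# One-way radio signal delay over a given straight-line distance.
# The Earth-Mars distance varies between roughly 34 million and
# 250 million miles, so the delay varies with it.
SPEED_OF_LIGHT_MPS = 186_282  # miles per second

def one_way_delay_minutes(distance_miles: float) -> float:
    """Minutes for a radio signal to cross the given distance."""
    return distance_miles / SPEED_OF_LIGHT_MPS / 60

# An assumed ~134-million-mile separation gives roughly the quoted delay.
print(round(one_way_delay_minutes(134_000_000)))  # → 12
```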

Those rocks will be picked up and returned to Earth in a five-year series of maneuvers involving relay rockets, rovers and orbital transfers starting in 2026 that will make the retrieval of the moon rocks look as easy as shipping holiday cookies to your relatives. The rocks that return starting in 2031 will be scrutinized for years, like the Dead Sea Scrolls, for what they might say about the hidden history of our lost twin and, perhaps, the earliest days of life in the Solar System.

The generation that followed World War II carried out the first great reconnaissance of the solar system. It could be the destiny of this generation to carry out the next great reconnaissance, to discover if we have or ever had any neighbors on these worlds. In Jezero Crater, the dream lives on. We may not ever live on Mars, but our machines already do.

Photos and video at:

https://www.nytimes.com/2021/03/02/scie ... iversified
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

That Is Not How Your Brain Works

Forget these scientific myths to better understand your brain and yourself.


The 21st century is a time of great scientific discovery. Cars are driving themselves. Vaccines against deadly new viruses are created in less than a year. The latest Mars rover is hunting for signs of alien life. But we’re also surrounded by scientific myths: outdated beliefs that make their way regularly into news stories.

Being wrong is a normal and inevitable part of the scientific process. We scientists do our best with the tools we have, until new tools extend our senses and let us probe more deeply, broadly, or precisely. Over time, new discoveries lead us to major course corrections in our understanding of how the world works, such as natural selection and quantum physics. Failure, therefore, is an opportunity to discover and learn.1

But sometimes, old scientific beliefs persist, and are even vigorously defended, long after we have sufficient evidence to abandon them. As a neuroscientist, I see scientific myths about the brain repeated regularly in the media and corners of academic research. Three of them, in particular, stand out for correction. After all, each of us has a brain, so it’s critical to understand how that three-pound blob between your ears works.

Myth number one is that specific parts of the human brain have specific psychological jobs. According to this myth, the brain is like a collection of puzzle pieces, each with a dedicated mental function. One puzzle piece is for vision, another is for memory, a third is for emotions, and so on. This view of the brain became popular in the 19th century, when it was called phrenology. Its practitioners believed they could discern your personality by measuring bumps on your skull. Phrenology was discredited by better data, but the general idea was never fully abandoned.2

Today, we know the brain isn’t divided into puzzle pieces with dedicated psychological functions. Instead, the human brain is a massive network of neurons.3 Most neurons have multiple jobs, not a single psychological purpose.4 For example, neurons in a brain region called the anterior cingulate cortex are regularly involved in memory, emotion, decision-making, pain, moral judgments, imagination, attention, and empathy.

I’m not saying that every neuron can do everything, but most neurons do more than one thing. For example, a brain region that’s intimately tied to the ability to see, called primary visual cortex, also carries information about hearing, touch, and movement.5 In fact, if you blindfold people with typical vision for a few days and teach them to read braille, neurons in their visual cortex become more devoted to the sense of touch.6 (The effect disappears in a day or so without the blindfold.)

In addition, the primary visual cortex is not necessary for all aspects of vision. Scientists have believed for a long time that severe damage to the visual cortex in the left side of your brain will leave you unable to see out of your right eye, assuming that the ability to see out of one eye is largely due to the visual cortex on the opposite side. Yet more than 50 years ago, studies on cats with cortical blindness on one side showed that it is possible to restore some of the lost sight by cutting a connection deep in the cat’s midbrain. A bit more damage allowed the cats to orient toward and approach moving objects.

Perhaps the most famous example of puzzle-piece thinking is the “triune brain”: the idea that the human brain evolved in three layers. The deepest layer, known as the lizard brain and allegedly inherited from reptile ancestors, is said to house our instincts. The middle layer, called the limbic system, allegedly contains emotions inherited from ancient mammals. And the topmost layer, called the neocortex, is said to be uniquely human—like icing on an already baked cake—and supposedly lets us regulate our brutish emotions and instincts.

This compelling tale of brain evolution arose in the mid 20th century, when the most powerful tool for inspecting brains was an ordinary microscope. Modern research in molecular genetics, however, has revealed that the triune brain idea is a myth. Brains don’t evolve in layers, and all mammal brains (and most likely, all vertebrate brains as well) are built from a single manufacturing plan using the same kinds of neurons.

Nevertheless, the triune brain idea has tremendous staying power because it provides an appealing explanation of human nature. If bad behavior stems from our inner beasts, then we’re less responsible for some of our actions. And if a uniquely human and rational neocortex controls those beasts, then we have the most highly evolved brain in the animal kingdom. Yay for humans, right? But it’s all a myth. In reality, each species has brains that are uniquely and effectively adapted to their environments, and no animal brain is “more evolved” than any other.

So why does the myth of a compartmentalized brain persist? One reason is that brain-scanning studies are expensive. As a compromise, typical studies include only enough scanning to show the strongest, most robust brain activity. These underpowered studies produce pretty pictures that appear to show little islands of activity in a calm-looking brain. But they miss plenty of other, less robust activity that may still be psychologically and biologically meaningful. In contrast, when studies are run with enough power, they show activity in the majority of the brain.7

Another reason is that animal studies sometimes focus on one small part of the brain at a time, even just a few neurons. In pursuit of precision, they wind up limiting their scope to the places where they expect to see effects. When researchers instead take a more holistic approach that focuses on all the neurons in a brain—say, in flies, worms, or even mice—the results look more like whole-brain effects.8

Pretty much everything that your brain creates, from sights and sounds to memories and emotions, involves your whole brain. Every neuron communicates with thousands of others at the same time. In such a complex system, very little that you do or experience can be traced to a simple sum of parts.

Myth number two is that your brain reacts to events in the world. Supposedly, you go through your day with parts of your brain in the off position. Then something happens around you, and those parts switch on and “light up” with activity.

Brains, however, don’t work by stimulus and response. All your neurons are firing at various rates all the time. What are they doing? Busily making predictions.9 In every moment, your brain uses all its available information (your memory, your situation, the state of your body) to take guesses about what will happen in the next moment. If a guess turns out to be correct, your brain has a head start: It’s already launching your body’s next actions and creating what you see, hear, and feel. If a guess is wrong, the brain can correct itself and hopefully learn to predict better next time. Or sometimes it doesn’t bother correcting the guess, and you might see or hear things that aren’t present or do something that you didn’t consciously intend. All of this prediction and correction happens in the blink of an eye, outside your awareness.
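The predict-correct cycle described here can be caricatured in a few lines of code. This is a toy numerical sketch, not a model of real neurons; the starting guess of 0.0 and the learning rate are arbitrary assumptions:

```python
# Toy predictor: guess the next value, measure the error, and nudge the
# guess toward what was actually observed.
def predict_and_correct(observations, learning_rate=0.5):
    """Return the guesses made before each observation arrives."""
    guess = 0.0
    guesses = []
    for observed in observations:
        guesses.append(guess)            # prediction made in advance
        error = observed - guess         # prediction error
        guess += learning_rate * error   # correction for next time
    return guesses

# Against a steady signal, the guesses close in on the true value.
print(predict_and_correct([10, 10, 10, 10]))  # → [0.0, 5.0, 7.5, 8.75]
```

Each pass through the loop is the "head start" the article describes: by the time the observation arrives, a guess is already in hand, and only the error needs processing.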

If a predicting brain sounds like science fiction, here’s a quick demonstration. What is this picture?
[Image: Barrett_BREAKER-2, an ambiguous picture of curvy lines]
If you see only some curvy lines, then your brain is trying to make a good prediction and failing. It can’t match this picture to something similar in your past. (Scientists call this state “experiential blindness.”) To cure your blindness, visit lisafeldmanbarrett.com/nautilus and read the description, then come back here and look at the picture again. Suddenly, your brain can make meaning of the picture. The description gave your brain new information, which conjured up similar experiences in your past, and your brain used those experiences to launch better predictions for what you should see. Your brain has transformed ambiguous, curvy lines into a meaningful perception. (You will probably never see this picture as meaningless again.)

Predicting and correcting is a more efficient way to run a system than constantly reacting in an uncertain world. This is clear every time you watch a baseball game. When the pitcher hurls the ball at 96 miles per hour toward home plate, the batter doesn’t have enough time to wait for the ball to come close, consciously see it, and then prepare and execute the swing. Instead, the batter’s brain automatically predicts the ball’s future location, based on rich experience, and launches the swing based on that prediction, to have any hope of hitting the ball. Without a predicting brain, sports as we know them would be impossible to play.

What does all this mean for you? You’re not a simple stimulus-response organism. The experiences you have today influence the actions that your brain automatically launches tomorrow.

The third myth is that there’s a clear dividing line between diseases of the body, such as cardiovascular disease, and diseases of the mind, such as depression. The idea that body and mind are separate was popularized by the philosopher René Descartes in the 17th century (known as Cartesian dualism) and it’s still around today, including in the practice of medicine. Neuroscientists have found, however, that the same brain networks responsible for controlling your body are also involved in creating your mind.10 A great example is the anterior cingulate cortex, which I mentioned earlier. Its neurons not only participate in all the psychological functions I listed, but also regulate your organs, hormones, and immune system to keep you alive and well.

Every mental experience has physical causes, and physical changes in your body often have mental consequences, thanks to your predicting brain. In every moment, your brain makes meaning of the whirlwind of activity inside your body, just as it does with sense data from the outside world. That meaning can take different forms. If you have tightness in your chest that your brain makes meaningful as physical discomfort, you’re likely to visit a cardiologist. But if your brain makes meaning of that same discomfort as distress, you’re more likely to book time with a psychiatrist. Note that your brain isn’t trying to distinguish two different physical sensations here. They are pretty much identical, and an incorrect prediction can cost you your life. Personally, I have three friends whose mothers were misdiagnosed with anxiety11 when they had serious illnesses, and two of them died.

When it comes to illness, the boundary between physical and mental is porous. Depression is usually catalogued as a mental illness, but it’s as much a metabolic illness as cardiovascular disease, which itself has significant mood-related symptoms. These two diseases occur together so often that some medical researchers believe that one may cause the other. That perspective is steeped in Cartesian dualism. Both depression12 and cardiovascular disease13 are known to involve problems with metabolism, so it’s equally plausible that they share an underlying cause.

When thinking about the relationship between mind and body, it’s tempting to indulge in the myth that the mind is solely in the brain and the body is separate. Under the hood, however, your brain creates your mind while it regulates the systems of your body. That means the regulation of your body is itself part of your mind.

Science, like your brain, works by prediction and correction. Scientists use their knowledge to fashion hypotheses about how the world works. Then they observe the world, and their observations become evidence they use to test the hypotheses. If a hypothesis did not predict the evidence, then they update it as needed. We’ve all seen this process in action during the pandemic. First we heard that COVID-19 spread on surfaces, so everyone rushed to buy Purell and Clorox wipes. Later we learned that the virus is mainly airborne and the focus moved to ventilation and masks. This kind of change is a normal part of science: We adapt to what we learn. But sometimes hypotheses are so strong that they resist change. They are maintained not by evidence but by ideology. They become scientific myths.

https://nautil.us/issue/98/mind/that-is ... b00bf1f6eb
swamidada
Posts: 1436
Joined: Sun Aug 02, 2020 8:59 pm

Post by swamidada »

At Dubai airport, travelers' eyes become their passports
ISABEL DEBRE
Associated Press Sun, March 7, 2021, 10:56 AM

A woman enters the face and iris-recognition gate to board a plane, during a media tour at Dubai Airport, in the United Arab Emirates, Sunday, March 7, 2021. Dubai's airport, the world’s busiest for international travel, has introduced an iris-scanner that verifies one’s identity and eliminates the need for any human interaction when leaving the country. It’s the latest artificial intelligence program the UAE has launched amid the surging coronavirus pandemic.

DUBAI, United Arab Emirates (AP) — Dubai’s airport, the world’s busiest for international travel, can already feel surreal, with its cavernous duty-free stores, artificial palm trees, gleaming terminals, water cascades and near-Arctic levels of air conditioning.

Now, the key east-west transit hub is rolling out another addition from the realm of science fiction — an iris-scanner that verifies one’s identity and eliminates the need for any human interaction when entering or leaving the country.

It’s the latest artificial intelligence program the United Arab Emirates has launched amid the surging coronavirus pandemic, contact-less technology the government promotes as helping to stem the spread of the virus. But the efforts also have renewed questions about mass surveillance in the federation of seven sheikhdoms, which experts believe has among the highest per capita concentrations of surveillance cameras in the world.

Dubai's airport started offering the program to all passengers last month. On Sunday, travelers stepped up to an iris scanner after checking in, gave it a good look and breezed through passport control within seconds. Gone were the days of paper tickets or unwieldy phone apps.

In recent years, airports across the world have accelerated their use of timesaving facial recognition technology to move passengers to their flights. But Dubai's iris scan improves on the more commonplace automated gates seen elsewhere, authorities said, connecting the iris data to the country's facial recognition databases so the passenger needs no identifying documents or boarding pass. The unusual partnership between long-haul carrier Emirates, owned by a Dubai sovereign wealth fund, and the Dubai immigration office integrates the data and carries travelers from check-in to boarding in one fell swoop, they added.

“The future is coming," said Major Gen. Obaid Mehayer Bin Suroor, deputy director of the General Directorate of Residency and Foreign Affairs. “Now, all the procedures have become ‘smart,' around five to six seconds.”

But like all facial recognition technology, the program adds to fears of vanishing privacy in the country, which has faced international criticism for targeting journalists and human rights activists.

According to Emirates' biometric privacy statement, the airline links passengers' faces with other personally identifying data, including passport and flight information, retaining it for “as long as it is reasonably necessary for the purposes for which it was collected.” The agreement offered few details about how the data will be used and stored, beyond saying that while the company didn't make copies of passengers' faces, other personal data “can be processed in other Emirates' systems.”

Bin Suroor stressed that Dubai's immigration office “completely protects” passengers' personal data so that “no third party can see it.”

But without more information about how data will be used or stored, biometric technology raises the possibility of misuse, experts say.

“Any kind of surveillance technology raises red flags, regardless of what kind of country it’s in,” said Jonathan Frankle, a doctoral student in artificial intelligence at the Massachusetts Institute of Technology. ”But in a democratic country, if the surveillance technology is used transparently, at least there’s an opportunity to have a public conversation about it."

Iris scans, requiring people to stare into a camera as though they're offering a fingerprint, have become more widespread worldwide in recent years as questions have arisen over the accuracy of facial recognition technology. Iris biometrics are considered more reliable than surveillance cameras that scan people's faces from a distance without their knowledge or consent.

Despite concerns about overzealous surveillance in the UAE, the country's vast facial recognition network only shows signs of expanding. Last month, Prime Minister Sheikh Mohammed bin Rashid Al Maktoum, who also serves as Dubai's ruler, announced the country would begin trials of new facial recognition technology to cut down on paperwork in “some private sector services,” without elaborating.

During the pandemic, the skyscraper-studded city of Dubai has advanced an array of technological tools to fight the virus in malls and on streets, including disinfectant foggers, thermal cameras and face scans that check for masks and take temperatures. The programs similarly use cameras that can record and upload people's data, potentially feeding the information into the city-state's wider biometric databases.

https://currently.att.yahoo.com/news/du ... 04276.html
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Can the World Learn From South Africa’s Vaccine Trials?

Vaccine trials are often done in wealthier countries. Scientists say the South Africa experience proves the value of trials in the global south.


In a year that has seesawed between astonishing gains and brutal setbacks on Covid-19, few moments were as sobering as the revelation last month that a coronavirus variant in South Africa was dampening the effect of one of the world’s most potent vaccines.

That finding — from a South African trial of the Oxford-AstraZeneca shot — exposed how quickly the virus had managed to dodge human antibodies, ending what some researchers have described as the world’s honeymoon period with Covid-19 vaccines and setting back hopes for containing the pandemic.

As countries adjust to that jarring turn of fortune, the story of how scientists uncovered the dangers of the variant in South Africa has put a spotlight on the global vaccine trials that were indispensable in warning the world.

“Historically, people might have thought a problem in a country like South Africa would stay in South Africa,” said Mark Feinberg, the chief executive of IAVI, a nonprofit scientific research group. “But we’ve seen how quickly variants are cropping up all around the world. Even wealthy countries have to pay a lot of attention to the evolving landscape all around the world.”

Once afterthoughts in the vaccine race, those global trials have saved the world from sleepwalking into year two of the coronavirus, oblivious to the way the pathogen could blunt the body’s immune response, scientists said. They also hold lessons about how vaccine makers can fight new variants this year and redress longstanding health inequities.

The deck is often stacked against medicine trials in poorer countries: Drug and vaccine makers gravitate to their biggest commercial markets, often avoiding the expense and the uncertainty of testing products in the global south. Less than 3 percent of clinical trials are held in Africa.

Yet the emergence of new variants in South Africa and Brazil has shown that vaccine makers cannot afford to wait years, as they often used to, before testing whether shots made for rich countries work in poorer ones, too.

“If you don’t identify and react to what’s happening in some supposedly far-flung continent, it significantly impacts global health,” said Clare Cutland, a vaccine scientist at the University of the Witwatersrand in Johannesburg, who coordinated the Oxford trial. “These results highlighted to the world that we’re not dealing with a single pathogen that sits there and does nothing — it’s constantly mutating.”

More...

https://www.nytimes.com/2021/03/13/worl ... 778d3e6de3
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

The Pandemic and the Limits of Science

What have we learned from the year that lasted a century?


“The pandemic which has just swept round the earth has been without precedent.”

So noted a May 1919 article in the journal Science, “The Lessons of the Pandemic.” The author, Maj. George A. Soper, was an American civil and sanitation engineer who, among other accomplishments, had devised a plan for ventilating New York’s subway system. He was famous for having linked, in 1904, a series of typhoid fever outbreaks to a cook named Mary Mallon who was herself immune to the disease: Typhoid Mary, the first asymptomatic superspreader known to modern science.

The pandemic, of course, was the Spanish flu of 1918-1919, which caused 50 million deaths worldwide, including 675,000 in the United States. Scientists had no idea what had hit them, Soper wrote: “The most astonishing thing about the pandemic was the complete mystery which surrounded it.” Viruses were still unknown; the illness was clearly respiratory — pneumonia was a common result — but the culprit was thought to be bacterial. (The actual pathogen, an H1N1 influenza A virus, was not identified until the 1990s.)

“Nobody seemed to know what the disease was, where it came from or how to stop it,” Soper wrote. “Anxious minds are inquiring today whether another wave of it will come again.”

The pandemic currently underway could hardly be more transparent by comparison. Within weeks of the first cases of Covid-19, in Wuhan, scientists had identified the pathogen as a novel coronavirus, named it SARS-CoV-2, sequenced its genome and shared the data with labs around the world. Its every mutation and variant is tracked. We know how the virus spreads, who among us is more vulnerable and what simple precautions can be taken against it. Not one but several highly effective vaccines were developed in record time.

So perhaps one clear lesson of our pandemic is that, when allowed, science works. Not flawlessly, and not always at a pace suited to a global emergency. The Centers for Disease Control and Prevention was slow to recognize the coronavirus as an airborne threat. Even now, medicine has a better grasp of how to prevent coronavirus infection — masks, social distancing, vaccination — than how to treat it. But even this is edifying. The public has been able to watch science at its messy, iterative, imperfect best, with researchers scrambling to draw conclusions in real time from growing heaps of data. Never has science been so evidently a process, more muscle than bone.

And yet still the virus spread. Travel restrictions, school closures, stay-at-home orders. Illness and isolation, anxiety and depression. Loss after loss after loss: of dear friends and family members, of employment, of the simple company of others. Last week, the C.D.C. concluded that 2020 was the deadliest year in American history. For some, this past year seemed to last a century; for far too many people, this past year was their last.

So let another lesson of our pandemic be this: Science alone is not enough. It needs a champion, a pulpit, a spotlight, an audience. For months, the sound and obvious advice — wear a mask, avoid gatherings — was downplayed by government officials. Never mind the social fabric; discarding one’s mask was cast as an act of defiance and personal independence.

Read today, Soper’s essay stands out at first for its quaint medical advice. He urged his readers, sensibly, to “avoid needless crowding,” but also to “avoid tight clothes, tight shoes” and to chew one’s food thoroughly. He added, “It is not desirable to make the general wearing of masks compulsory.”

Most striking, though, are the main lessons he drew from his pandemic, which are all too applicable to ours. One, respiratory diseases are highly contagious, and even the common ones demand attention. Two, the burden of preventing their spread falls heavily on the individual. These create, three, the overarching challenge: “Public indifference,” Soper wrote. “People do not appreciate the risks they run.”

A hundred-plus years of medical progress later, the same obstacle remains. It is the duty of leadership, not science, to shake its citizens from indifference. Of course, indifference does not quite capture the reality of why we found it so challenging to stop congregating indoors or without masks. This pandemic has also revealed, perhaps, the power of our species’s desire to commune. We need each other, even against reason and sound public-health advice.

A week before “Lessons” appeared in 1919, Soper published another article, in the New York Medical Journal, making the case for an international health commission. “It should not be left to the vagaries of chance to encourage or stay the progress of those forms of disease, which neglected, become pestilences,” he argued. He imagined a supragovernmental agency charged with investigating and reporting the trajectory of dangerous diseases — “a live, efficient, energetic institution possessing real powers and capable of doing large things.”

He got his wish. Soper modeled his vision on the International Office of Public Health, established in Paris in 1908 and later absorbed into the United Nations World Health Organization, which was founded in April 1948, just two months before his death. But the W.H.O. could not contain Covid-19, either. Preventing the next pandemic will require far more coordination and planning within and between governments than was mustered this time, much less a century ago.

“Let us hope that the nations will see the need” and “initiate the work which so greatly requires to be done,” Soper wrote in 1919. Let us hope that, before the next pandemic comes, we will have done more than hope.

https://www.nytimes.com/2021/03/16/heal ... 778d3e6de3
swamidada
Posts: 1436
Joined: Sun Aug 02, 2020 8:59 pm

Post by swamidada »

Primordial lightning strikes may have helped life emerge on Earth
Will Dunham
Reuters Tue, March 16, 2021, 11:08 AM

By Will Dunham

WASHINGTON (Reuters) - The emergence of the Earth's first living organisms billions of years ago may have been facilitated by a bolt out of the blue - or perhaps a quintillion of them.

Researchers said on Tuesday that lightning strikes during the first billion years after the planet's formation roughly 4.5 billion years ago may have freed up phosphorus required for the formation of biomolecules essential to life.

The study may offer insight into the origins of Earth's earliest microbial life - and potential extraterrestrial life on similar rocky planets. Phosphorus is a crucial part of the recipe for life. It makes up the phosphate backbone of DNA and RNA, hereditary material in living organisms, and represents an important component of cell membranes.

On early Earth, this chemical element was locked inside insoluble minerals. Until now, it was widely thought that meteorites that bombarded early Earth were primarily responsible for the presence of "bioavailable" phosphorus. Some meteorites contain the phosphorus mineral called schreibersite, which is soluble in water, the medium in which life is thought to have formed.

When a bolt of lightning strikes the ground, it can create glassy rocks called fulgurites by super-heating and sometimes vaporizing surface rock, freeing phosphorus locked inside. As a result, these fulgurites can contain schreibersite.

The researchers estimated the number of lightning strikes spanning between 4.5 billion and 3.5 billion years ago based on atmospheric composition at the time and calculated how much schreibersite could result. The upper range was about a quintillion lightning strikes and the formation of upwards of 1 billion fulgurites annually.
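Those reported figures are internally consistent, as a minimal arithmetic sketch shows (assuming, for illustration, that the quintillion strikes are spread evenly over the billion-year span):

```python
# Order-of-magnitude check on the reported estimates: a quintillion
# lightning strikes over the billion years between 4.5 and 3.5 billion
# years ago averages out to about a billion strikes per year, the same
# order as the "upwards of 1 billion fulgurites annually".
total_strikes = 1e18  # upper-range estimate ("about a quintillion")
span_years = 1e9      # 4.5 billion to 3.5 billion years ago
strikes_per_year = total_strikes / span_years
print(f"{strikes_per_year:.0e}")  # → 1e+09
```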

Phosphorus minerals arising from lightning strikes eventually exceeded the amount from meteorites by about 3.5 billion years ago, roughly the age of the earliest-known fossils widely accepted to be those of microbes, they found.

"Lightning strikes, therefore, may have been a significant part of the emergence of life on Earth," said Benjamin Hess, a Yale University graduate student in earth and planetary sciences and lead author of the study published in the journal Nature Communications.

"Unlike meteorite impacts which decrease exponentially through time, lightning strikes can occur at a sustained rate over a planet's history. This means that lightning strikes also may be a very important mechanism for providing the phosphorus needed for the emergence of life on other Earth-like planets after meteorite impacts have become rare," Hess added.

The researchers examined an unusually large and pristine fulgurite sample formed when lightning struck the backyard of a home in Glen Ellyn, Illinois, outside Chicago. This sample illustrated that fulgurites harbor significant amounts of schreibersite.

"Our research shows that the production of bioavailable phosphorus by lightning strikes may have been underestimated and that this mechanism provides an ongoing supply of material capable of supplying phosphorous in a form appropriate for the initiation of life," said study co-author Jason Harvey, a University of Leeds associate professor of geochemistry.

Among the ingredients considered necessary for life are water, carbon, hydrogen, nitrogen, oxygen, sulfur and phosphorus, along with an energy source.

Scientists believe the earliest bacteria-like organisms arose in Earth's primordial waters, but there is a debate over when this occurred and whether it unfolded in warm and shallow waters or in deeper waters at hydrothermal vents.

"This model," Hess said, referring to phosphorous unlocked by lightning, "is applicable to only the terrestrial formation of life such as in shallow waters. Phosphorus added to the ocean from lightning strikes would probably be negligible given its size."

(Reporting by Will Dunham, Editing by Rosalba O'Brien)

https://currently.att.yahoo.com/news/pr ... 01488.html
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Scientists Grow Mouse Embryos in a Mechanical Womb

Biologists have long held that a fetus needs a living uterus to develop. Maybe not anymore.


The mouse embryos looked perfectly normal. All their organs were developing as expected, along with their limbs and circulatory and nervous systems. Their tiny hearts were beating at a normal 170 beats per minute.

But these embryos were not growing in a mother mouse. They were developed inside an artificial uterus, the first time such a feat has been accomplished, scientists reported on Wednesday.

The experiments, at the Weizmann Institute of Science in Israel, were meant to help scientists understand how mammals develop and how gene mutations, nutrients and environmental conditions may affect the fetus. But the work may one day raise profound questions about whether other animals, even humans, should or could be cultured outside a living womb.

In a study published in the journal Nature, Dr. Jacob Hanna described removing embryos from the uteruses of mice at five days of gestation and growing them for six more days in artificial wombs.

At that point, the embryos were about halfway through their development; full gestation is about 20 days. A human at this stage of development would be called a fetus. To date, Dr. Hanna and his colleagues have grown more than 1,000 embryos in this way.

“It really is a remarkable achievement,” said Paul Tesar, a developmental biologist at Case Western Reserve University School of Medicine.

Alexander Meissner, director of genome regulation at the Max Planck Institute for Molecular Genetics in Berlin, said that “getting this far is amazing” and that the study was “a major milestone.”

But the research has already progressed beyond what the investigators described in the paper. In an interview, Dr. Hanna said he and his colleagues had taken fertilized eggs from the oviducts of female mice just after fertilization — at Day 0 of development — and had grown them in the artificial uterus for 11 days.

Until now, researchers were able to fertilize eggs from mammals in the laboratory and grow them for only a short time. The embryos needed a living womb. “Placental mammals develop locked away in the uterus,” Dr. Tesar said.

That prevented scientists from answering fundamental questions about the earliest stages of development.

“The holy grail of developmental biology is to understand how a single cell, a fertilized egg, can make all of the specific cell types in the human body and grow into 40 trillion cells,” Dr. Tesar said. “Since the beginning of time, researchers have been trying to develop ways to answer this question.”

The only way to study the development of tissues and organs was to turn to species like worms, frogs and flies that do not need a uterus, or to remove embryos from the uteruses of experimental animals at varying times, providing glimpses of development more like snapshots than video.

What was needed was a way to get inside the uterus, watching and tweaking development in mammals as it happened. For Dr. Hanna, that meant developing an artificial uterus.

He spent seven years developing a two-part system that includes incubators, nutrients and a ventilation system. The mouse embryos are placed in glass vials inside incubators, where they float in a special nutrient fluid.

The vials are attached to a wheel that slowly spins so the embryos do not attach to the wall, where they would become deformed and die. The incubators are connected to a ventilation machine that provides oxygen and carbon dioxide to the embryos, controlling the concentration of those gases, as well as the gas pressure and flow rate.

At Day 11 of development — more than halfway through a mouse pregnancy — Dr. Hanna and his colleagues examined the embryos, only the size of apple seeds, and compared them to those developing in the uteruses of living mice. The lab embryos were identical, the scientists found.

By that time, though, the lab-grown embryos had become too large to survive without a blood supply. They had a placenta and a yolk sac, but the nutrient solution that fed them through diffusion was no longer sufficient.

Getting past that hurdle is the next goal, Dr. Hanna said in an interview. He is considering using an enriched nutrient solution or an artificial blood supply that connects to the embryos’ placentas.

In the meantime, experiments beckon. The ability to keep embryos alive and developing halfway through pregnancy “is a gold mine for us,” Dr. Hanna said.

The artificial womb may allow researchers to learn more about why pregnancies end in miscarriages or why fertilized eggs fail to implant. It opens a new window onto how gene mutations or deletions affect fetal development. Researchers may be able to watch individual cells migrate to their ultimate destinations.

The work is “a breakthrough,” said Magdalena Zernicka-Goetz, professor of biology and biological engineering at Caltech. It “opens the door to a new age of studying development in the experimental mouse model.”

A recent development provides another opportunity. Researchers have directly created mouse embryos from mouse fibroblasts — connective tissue cells — making early embryos without starting with a fertilized egg.

Combine that development with Dr. Hanna’s work, and “now you don’t need mice to study mouse embryo development,” Dr. Meissner said. Scientists may be able to make all the embryos they need from connective tissue.

If scientists could make embryos without fertilizing eggs and could study their development without a uterus, Dr. Meissner said, “you can get away from embryo destruction.” There would be no need to fertilize mouse eggs only to destroy them in the course of study.

But the work might eventually extend beyond mice. Two other papers published in Nature on Wednesday report on attempts that edge near creating early human embryos in this way. Of course, Dr. Meissner said, creation of human embryos is years away — if it is permitted at all. For now, scientists generally refrain from studying human embryos beyond 14 days of fertilization.

In the future, Dr. Tesar said, “it is not unreasonable that we might have the capacity to develop a human embryo from fertilization to birth entirely outside the uterus.”

Of course, even the suggestion of this science fiction scenario is bound to horrify many. But it is early days, with no assurance human fetuses could ever develop entirely outside the womb.

https://www.nytimes.com/2021/03/17/heal ... 778d3e6de3
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

The Trouble with Brain Scans

An aspiring cognitive scientist faces the sketchy truth about fMRI.


Image

One autumn afternoon in the bowels of UC Berkeley’s Li Ka Shing Center, I was looking at my brain. I had just spent 10 minutes inside the 3 Tesla MRI scanner, the technical name for a very expensive, very high maintenance, very magnetic brain camera. Lying on my back inside the narrow tube, I had swallowed my claustrophobia and let myself be enveloped in darkness and a cacophony of foghorn-like bleats.

At the time I was a research intern at UC Berkeley’s Neuroeconomics Lab. That was the first time I saw my own brain from an MRI scan. It was a grayscale, 3-D reconstruction floating on the black background of a computer screen. As an undergraduate who studied neuroscience, I was enraptured. There is nothing quite like a young scientist’s first encounter with an imaging technology that renders the hitherto invisible visible—magnetic resonance imaging took my breath away. I felt that I was looking not just inside my body, but into the biological recesses of my mind.

It was a strange self-image, if indeed it was one. My hair did not show up, leaving just the skull and outline of the face with a cross section of the tissues inside. Dragging my mouse, I cruised through the horizontal slices of my brain—there were the branching, root-like patterns of the cerebellum, the gaping black holes of the ventricles, and the undulating ridges of my cortex looking like snakes wiggling in the sand.

Full of excitement after my encounter with MRI, I consumed scientific papers and studied their figures, which were usually grayscale brains with bright orange and blue blobs on them indicating regions of increased activation. The following year I joined a lab at Harvard, where I started working on an experiment that used functional MRI, or fMRI, to study the brain regions involved in social decision-making. fMRI allows us to record what the brain is up to while people perform mental tasks. I committed to a senior thesis and set my future sights on a Ph.D. in cognitive science.

We seek something deeper in these pictures of blood flow in the brain.

Little did I anticipate what a scientific morass I had entered. Functional magnetic resonance imaging has transformed medicine. It allows non-invasive mapping of a patient’s brain regions to enable more accurate, precise neurosurgery, as well as validation of the pharmacological effects of potential drugs on human brains. But fMRI’s use in cognitive and psychological science is notoriously controversial. This is partly because the technology doesn’t directly measure neural activity but rather a proxy for it—oxygenated blood flow. It also requires a tremendous amount of data processing to sort signal from noise, processing that hinges on many discretionary choices on the researcher’s part.

More...

https://nautil.us/issue/98/mind/the-tro ... b00bf1f6eb
swamidada
Posts: 1436
Joined: Sun Aug 02, 2020 8:59 pm

Post by swamidada »

Sweden axes Bill Gates-funded Harvard experiment aiming to DIM THE SUN to fight climate change amid outcry from activists
2 Apr, 2021 01:17


Sweden’s space agency has called off a geoengineering experiment to determine whether blotting out the sun with aerosols could reverse global warming. Funded by Bill Gates, the project stoked fierce opposition from eco groups.
Proposed by researchers at Harvard University, the Stratospheric Controlled Perturbation Experiment, or SCoPEx, ultimately planned to release a cloud of calcium carbonate – more commonly known as chalk dust – into the atmosphere from a high-altitude balloon to study its effects on sunlight reaching Earth. The project proved too controversial, however, and on Wednesday the Swedish Space Corporation (SSC) said that a test flight set for June would not move forward.

“The scientific community is divided regarding geoengineering, including any related technology tests such as the planned technical balloon test flight from Esrange this summer,” the SSC said in a statement on Wednesday.

“SSC has had dialogues this spring with both leading experts on geo-engineering and with other stakeholders, as well as with the SCoPEx Advisory Board. As a result of these dialogues and in agreement with Harvard, SSC has decided not to conduct the technical test flight planned for this summer.”

Planned for the Arctic town of Kiruna, the June flight would have merely tested the balloon’s systems to pave the way for the study, but local activists have vocally opposed the initiative. In a joint letter penned last month, the Saami Council – which advocates for Sweden’s indigenous Saami people – and three environmental groups warned that the SCoPEx experiment could have “catastrophic consequences.”

While SCoPEx’s website states the experiment would pose “no significant hazard to people or the environment” and would release only a small amount of particles into the air, the Saami Council has opposed solar geoengineering in concept, saying it “essentially attempts to mimic volcanic eruptions by continuously spewing the sky with sun-dimming particles.” The activist groups also argued SCoPEx could distract from the goal of reducing carbon emissions and have “irreversible sociopolitical effects that could compromise the world’s necessary efforts to achieve zero-carbon societies.”

The experiment has received backing from Harvard’s Solar Geoengineering Research Program, whose website names billionaire climate crusader Bill Gates among its private donors, though it does not specify his contribution. Previously, a lead researcher working on SCoPEx, David Keith of Harvard, separately received at least $4.6 million from Gates for his Fund for Innovative Climate and Energy Research (FICER), according to the Guardian. Founded in 2007, FICER also funds research into solar geoengineering.

Asked at a 2010 TED talk about what “emergency measures” mankind could implement to fight climate change should all else fail, Gates suggested solar geoengineering could be an “insurance policy.”

“There is a line of research on what's called geoengineering, which are various techniques that would delay the heating to buy us 20 or 30 years to get our act together. Now that's just an insurance policy, you hope that you don't need to do that,” he said, adding that the idea might be “kept in the back pocket.”

Keith, a professor of applied physics at the Harvard School of Engineering and Applied Sciences, told Reuters the SSC’s decision to scrap the experiment was “a setback,” but noted that the project could move to the US if the troubles continue in Europe. Until then, the researchers say they will continue to lobby for SCoPEx in Sweden, hoping to generate public interest and support.

The experiment could find traction in the States, as a report published last week by the National Academies of Sciences, Engineering and Medicine called on the government to devote between $100 million and $200 million to solar geoengineering projects, including technologies to dim out the sun, over the next five years.

https://www.rt.com/news/519904-sweden-e ... sun-gates/
swamidada
Posts: 1436
Joined: Sun Aug 02, 2020 8:59 pm

Post by swamidada »

Modern brains evolved 1.7m years ago in Africa, study finds, after examining skulls of mankind’s ancient ancestors
9 Apr, 2021 15:13

When humanity’s very ancient ancestors took their first steps out of Africa, some 3 million years ago, their brains looked more like those of great apes. A new study says our big brains developed only some 400,000 years later.
Using CT technology to scan what skull fossils remain from our earliest ancestors, researchers at the University of Zurich have turned conventional scientific thinking on its head, saying that our modern brains began to evolve in Africa only about 1.7 million years ago.

Prior to the Swiss research, conventional scientific thinking was that our hominid lineage arose some 2.8 million years ago, and our predecessors spread out of Africa around 2.1 million years ago.

The researchers, who published their study in the current edition of the journal Science, used CT scans to analyze replicas of the brain’s outer surface re-created from the oldest known fossils of early human skulls. The 1.77-million to 1.85-million-year-old fossils are from the Dmanisi archaeological site in Georgia, and were compared by the researchers with bones roughly two million to 70,000 years old from sites in Africa and Southeast Asia.

For their research, the Swiss scientists focused on the frontal lobes – the areas of the human brain linked with complex mental tasks such as toolmaking and language. Early hominids from Dmanisi and Africa were found to have retained a great-ape-like organization of their frontal lobe some 1.8 million years ago – long after they began moving away from Africa.

“According to our analyses, modern human brain structures only emerged 1.5 to 1.7 million years ago – in African Homo populations,” said study author Christoph Zollikofer, a paleoanthropologist at the University of Zurich.

The findings reveal early humans may have possessed relatively primitive brains even after they first began dispersing from Africa. Speaking of their findings, the researchers pointed out to Science News that these more primitive populations were nonetheless capable of producing a variety of tools, as well as using animals and caring for their elders.

Their study, meanwhile, notes that the development of the more modern brain “largely coincides” with the earliest evidence of more complex “technocultural performance” in Africa, prompting the team to hypothesize that the biological and cultural changes were mutually dependent.

“It was during this period that the earliest forms of human language developed,” said Marcia Ponce de León, anthropologist and study co-author.

The researchers say that hominids with modern human-like brains appeared in Southeast Asia shortly after 1.5 million years ago, suggesting additional dispersals from Africa separate from the earlier first migration.

The study said further research was needed to say with certainty whether this second wave merged with or replaced the earlier groups.

https://www.rt.com/news/520579-modern-b ... y-fossils/
kmaherali
Posts: 25107
Joined: Thu Mar 27, 2003 3:01 pm

Post by kmaherali »

Are You Confused by Scientific Jargon? So Are Scientists

Scientific papers containing lots of specialized terminology are less likely to be cited by other researchers.


Polje, nappe, vuggy, psammite. Some scientists who study caves might not bat an eye, but for the rest of us, these terms might as well be ancient Greek.

Specialized terminology isn’t unique to the ivory tower — just ask a baker about torting or an arborist about bracts, for example. But it’s pervasive in academia, and now a team of researchers has analyzed jargon in a set of over 21,000 scientific manuscripts. They found that papers containing higher proportions of jargon in their titles and abstracts were cited less frequently by other researchers. Science communication — with the public but also among scientists — suffers when a research paper is packed with too much specialized terminology, the team concluded.

These results were published Wednesday in Proceedings of the Royal Society B.

Jargon can be a problem, but it also serves a purpose, said Hillary Shulman, a communications scientist at Ohio State University. “As our ideas become more refined, it makes sense that our concepts do too.” This language-within-a-language can be a timesaver, a way to precisely convey meaning, she said. However, it also runs the risk of starkly reminding people — even some well-educated researchers — that they aren’t “in the know.”

“It’s alienating,” said Dr. Shulman.

Two scientists recently investigated how the use of jargon affects a manuscript’s likelihood of being cited in other scientific journal articles. Such citations are an acknowledgment of a study’s importance and relevance, and they’re used to estimate a researcher’s productivity.

Alejandro Martínez, an evolutionary biologist, and Stefano Mammola, an ecologist, both at the National Research Council in Pallanza, Italy, started by collecting scientific papers. Using the Web of Science, an online platform that allows subscribers to access databases of scholarly publications, they zeroed in on 21,486 manuscripts focused on cave research.

Cave science is a particularly jargon-heavy field, Dr. Martínez said. That’s because it attracts a diverse pool of researchers, each of whom brings their own terminology. Anthropologists, geologists, zoologists and ecologists all end up meeting in caves, he said. “They like the rocks or the bugs or the human remains or the wall paintings.”

To compile a list of cave-related jargon words, Dr. Martínez combed over the glossaries of caving books and review studies. He settled on roughly 1,500 terms (including the four that appear at the beginning of this article).

Dr. Mammola then wrote a computer program to calculate the proportion of jargon words in each manuscript’s title and abstract. Papers with a higher fraction of jargon received fewer citations, the researchers found. And none of the most highly cited papers — with more than 450 citations — used jargon in their title, while almost all had abstracts where fewer than 1 percent of the words were jargon.
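A minimal sketch of that kind of jargon-proportion calculation might look like the following. The jargon set and sample abstract here are invented for illustration; the actual study used roughly 1,500 cave-science terms drawn from glossaries and review studies.

```python
# Sketch of a jargon-proportion calculation like the one described above.
# The JARGON set and the sample abstract are invented examples; they are
# not the study's actual word list or data.

import re

JARGON = {"polje", "nappe", "vuggy", "psammite", "schreibersite"}

def jargon_fraction(text: str) -> float:
    """Return the proportion of words in `text` found in the JARGON set."""
    words = re.findall(r"[a-z]+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in JARGON)
    return hits / len(words)

abstract = "We describe a vuggy psammite unit overlying the nappe."
print(f"{jargon_fraction(abstract):.2f}")  # 3 jargon words out of 9 -> 0.33
```

Applied to each paper's title and abstract, a score like this is what the researchers correlated against citation counts.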

As citations are often viewed as a metric of academic success, jargon has a negative effect on a paper, Dr. Martínez and Dr. Mammola propose. Fewer citations can mean that a paper isn’t getting read and remembered, which is bad news for science communication overall, the team concluded.

Other researchers have found, however, that using less-common words — a form of jargon — can be beneficial. David Markowitz, a psychology of language researcher at the University of Oregon, analyzed the abstracts of nearly 20,000 proposals for funding from the National Science Foundation. His results, published in 2019, revealed that abstracts that contained fewer common words tended to garner more grant funding. “Jargon doesn’t always associate with negative outcomes,” Dr. Markowitz said.

But clear communication should always be a goal in science, said Sabine Stanley, a planetary scientist at Johns Hopkins University. “It’s important to step back and always remind yourself as a scientist: how do I describe what I’m doing to someone who is not doing this 24/7 like I am?”

Dr. Stanley recently participated in the Up-Goer Five Challenge at the annual meeting of the American Geophysical Union. Inspired by an xkcd comic by Randall Munroe (an occasional Times contributor) that explains the Saturn V rocket in plain language, the event challenges participants to communicate their science using only the thousand most-common words in the English language (a text editor is available).

“It’s quite challenging,” said Dr. Stanley, who presented new results from the Mars InSight lander.

The title of her talk? “A Space Computer Named In Sight Landed on the Red World Last Year and Here Is What We Found So Far.”

https://www.nytimes.com/2021/04/09/scie ... 778d3e6de3
swamidada
Posts: 1436
Joined: Sun Aug 02, 2020 8:59 pm

Post by swamidada »

19 Apr, 2021 21:50

People trust algorithms more than other human beings, study finds. Who do they think programs those computers?
© Pixabay / geralt

No matter how many algorithmic ‘glitches’ and backdoors we find in our electronics, we can’t seem to put them down. Most of us would argue we need technology, but a growing subset trust it more than people. Why?
In an era where humans are hamstrung by fear of bias, it’s easier to delegate certain tasks to computers, especially if the task is particularly trying or difficult. Certainly it’s hard to believe a seemingly-impartial silicon chip could be anything other than immune to the social pressures of human interaction in all its forms. But someone has to program the algorithms that make these devices work, and chances are that “someone” has their own set of prejudices and preconceived notions that they don’t check at the door when they arrive at work every day.

That’s not necessarily a problem - humans are biased creatures, and the sooner we learn to live with that reality instead of hiring con artists to beat it out of us with a sledgehammer (or PR flacks to convince the world other forms of it don’t exist), the less stressful our lives will be. But computer modeling is vulnerable to deadly flaws, and grasping at silicon straws isn’t going to save us. Oxford University scientists may work day and night on ethnically-based algorithms to determine how likely an innocent individual is to contract a serious disease over the course of their life, but instead of concentrating on protecting the vulnerable group against that disease, the UK government seems to be focusing on marginalizing these people, a strategy whose uselessness was already proven during World War II, to the tragedy of all involved.

Lest we believe such deadly algorithms are a thing of the past, Google is still using its “Machine Learning Fairness” algorithms to shove a warped vision of “equality” down our throats.

Humans now trust algorithms more than each other, according to new research
A study published last week by the University of Georgia found humans are relying more and more on computer algorithms “as a task gets harder.” Additionally, “that effect is stronger than the bias towards relying on advice from other people,” study supervisor Eric Bogert said in the writeup, published in Nature’s Scientific Reports.

The results are troubling, not least because once humans get started down the slippery slope of depending on computers for everything, they quickly abandon human interaction as too troublesome, too difficult, wrought with too many unseen hazards or potential offenses, and in general too much work. Reality doesn’t necessarily measure up to this viewpoint, and the real danger for humanity is where it collides with this wishful thinking.

The benefits of technology can be seen wherever we rest our eyes, and we undeniably move more smoothly through space and time. But we’ve also seen the harms it can do, whether it’s creating an entire generation of overmedicated children incapable of sitting still for more than five minutes before demanding their screens, turning their parents into twitchy addicts afflicted with (but in denial of) the same condition, or producing empathy-free sociopaths who simply find real life dull in comparison to the myriad options available in virtual reality and don’t understand that the figures they interact with in real life are living, breathing human beings. Many of us, it seems, have been ‘spoiled’ by technology and can no longer find joy in the comparatively slower pace of normal life.

Some might argue this is being done on purpose - humans becoming inexorably, permanently dumbed down and wedded to technology that is growing ever more expensive, complex and probing, giving whoever owns the backdoors to the beloved software of the moment an unthinkable level of access to people’s inner thoughts and motivations. Controllers of these companies can easily obtain the secret thoughts and behavioral patterns of an entire population without having to resort to time-consuming methods like interrogation, and - perhaps more important - the path from user to technology is not a one-way street. These extremely powerful corporations don’t just slurp up the ideas they find inside the user’s mind; given enough time, they can ultimately start feeding “nudges” and other suggestions to the individual, growing more powerful with regard to controlling every thought that passes through our heads the longer the user interacts with the program.

Why would a smartphone or laptop user permit such violation of their inner thoughts? Surely they’ve read enough literature about the dangers of these devices over the last several decades. Is it a form of Stockholm Syndrome? Have we learned to love our silicon captors because we believe that in voluntarily relinquishing control of our own thoughts to a circle of computerized devices we are benefiting from their digital assistance?

Because humans do realize they’re dependent on their digital drug. For an example of just how far we’ve gone down the rabbit hole, take away the iPhone from a millennial living in a trendy New York City neighborhood after a few rounds of drinks. They will panic, utterly incapable of finding their way home without the helping hand of Siri or an Uber. Few are willing to get into the driver’s seat of their car without Google or Siri charged up and ready to guide them. Watching a modern couple on a date is somewhat depressing as both halves of the couple spend 90% of the time gazing at their phones, rather than each other. This is all considered normal.

Some would certainly argue we’re learning to trust the bots more than humans because we benefit from a growing symbiosis with them. Certainly this seems to be the case with young ‘early adopters,’ whose popularity is measured by the number of devices they flaunt on their limbs, no matter how secretly embarrassed they must become when they look at their Amazon Halo’s stern writeup of how they gained X amount of weight this week and their conversational tone has become notably more shrill. Still, in the younger generation, preteens can rake in hundreds of thousands of dollars as they dance and lip-sync in seemingly pointless mimicry to the same 15-second clip of pop songs for millions of TikTok watchers - lest anyone think 15 minutes was too tedious.

Ultimately, our devices know us better than anyone else. It is these silicon parasites who develop their own identities based on our own, becoming a source of comfort, a literal second self to talk to when one is lonely and one hasn’t taken the time to go out into the world and develop real friends (ironically, because one has been spending too much time at home on their laptop). The “second you” never makes any demands and can be shut off when it wants too much attention, making it the perfect friend, as you always know what it’s going to do next.

This isn’t how human bonds are supposed to work — though don’t expect to learn that from watching the people around you. More and more, this is how people abuse their relationships. What, you don’t want to sit there and watch me unburden myself for three hours and ignore you for the rest of the night while we stare at a screen together? Clearly that’s your problem, not mine. That last point is the real purpose of all this up-with-robots, down-with-humans argument. It takes a truly callous individual to continue doing the wrong thing even though you know you’re treating a person like an object and preventing them from achieving their goals in life. We humans even have a saying for it: the definition of insanity is doing the same thing over and over and expecting different results. But robots and algorithms don’t have this problem, unless it’s programmed into their software. And how would such an app be marketed? "Dysfunctional Girlfriend"? Not a big money-maker, that one.

But far from solving that problem, humanity is only rushing headlong further into it. Today’s generation, raised without the ability to recognize emotional cues in their surroundings due to omnipresent mask-wearing, distance learning, and a society that encourages digital narcissism, is being born into a world wholly unprepared to teach them how to understand the signs of fear, love, hate, and other emotional indicators out in the wild.

Computers, on the other hand, don’t have such pesky emotional underlayers for their interlocutor to translate - what you see is what you get. Even humor is beyond the range of Alexa’s or Siri’s capabilities - AI can’t understand or tell jokes, though it can repeat quips someone else tells it without grasping the humor within. On the bright side, this is what sets us humans apart from the silicon brutes - just try having a chuckle with Alexa, and you’ll be running to find a human friend in minutes - only to meet hordes of half-human, half-Alexas all solicitously asking you “what’s wrong?” without caring at all for the answer, like something out of Invasion of the Body Snatchers.

Putting one’s trust in a computer may be the route toward certitude, predictability, and security. In a world where humans are conditioned to reach for the path of least resistance, many of us may want to skip over the nuances of learning human emotion - after all, aren’t humans those nasty things that spread disease and spill things and otherwise demand attention? Sounds too much like work. But if we’re truly trusting computers more than humans, that means we’re jettisoning one of the key puzzle pieces that make up our consciousness. Before we throw it away, we might want to reconsider what a tedious life a Siri or an Alexa truly lives - when humans aren’t entertaining it, that is. Be careful what you wish for.

https://www.rt.com/op-ed/521530-people- ... ters-bias/
swamidada
Posts: 1436
Joined: Sun Aug 02, 2020 8:59 pm

Post by swamidada »

Every Penguin in the World Comes from Earth's Lost Eighth Continent
Caroline Delbert
Popular Mechanics, Fri, October 2, 2020

The oldest known crested penguin fossils found in New Zealand point to a much older species.

Researchers love New Zealand's fossil record of penguins, including giant "monster penguins" that are almost 6 feet tall.

New Zealand is the last trace of a giant continent called Zealandia, which sank about 60 million years ago.

Researchers have found fossils that they say show almost conclusively that every penguin on Earth originally came from modern-day New Zealand. The small landmass we see today comprises only the topmost points of a sunken landmass once known as continental Zealandia, back-named from the Dutch imperialist name for New Zealand, Nieuw Zeeland - not to be confused with the human-built micronation of Sealand.

Zealandia sank about 60 million years ago, meaning it stayed afloat a little longer than the dinosaurs, at least. Penguins date back to about 62 million years ago, and like the Darwinian winners they are, they traveled with the higher ground and survived the fall of Zealandia.

Today, researchers have fossils in hand that date back 3 million years and indicate a previously unknown ancient penguin species. And where penguin species now have wider beaks and jaws, this newly discovered crested penguin shows bone structure indicating a different diet.

“That deep bills arose so late in the greater than 60 million year evolutionary history of penguins suggests that dietary shifts may have occurred as wind-driven Pliocene upwelling radically restructured southern ocean ecosystems,” the researchers, from New Zealand and the U.S., explain in their paper.

Locals first found the fossils and alerted the researchers, who began to excavate and study the penguin specimens. While these specific fossils are 3 million years old, the scientists say the body of evidence suggests penguins’ ancestors date back to that much older time frame—60 million years. And even though 3 million years sounds young compared with 60 million, the crested penguin fossils still have beaks that show a diet different from that of today’s penguins.

The fossils are tightly dated, meaning scientists can pinpoint them to a relatively small window of time based on the rock composition surrounding them. And using a biological geography software tool called BioGeoBEARS (BEARS stands for “Bayesian Evolutionary Analysis with R Scripts,” referring to the formulae and programming language in play), these scientists have introduced the 3-million-year-old crested penguin fossil into the larger—and exclusive—evolution of original penguin species on Zealandia.

This, the researchers told Business Insider, represents a huge change from the previous “oldest” penguin fossils on record:

“[E]arlier studies had only dated the presence of crested penguins on New Zealand back about 7,000 years. The new timeline suggests the region is the penguin’s most likely place of origin.”

The finding isn't necessarily a surprise, but the fossil record is spotty in a way that means not all logically sound theses end up being substantiated by evidence. The researchers write:

“New Zealand is a globally significant hotspot for seabird diversity, but the sparse fossil record for most seabird lineages has impeded our understanding of how and when this hotspot developed.”

Because of its unique position, both geographically isolated and the remnant of an entire former continent, New Zealand has long had a special ecosystem even compared with Australia’s unusual spread of flora and fauna—more like Madagascar or the Galapagos.

“Our analyses provide a timeframe for recruitment of crown penguins into the New Zealand avifauna, indicating this process began in the late Neogene and was completed via multiple waves of colonizing lineages,” the researchers conclude.

https://currently.att.yahoo.com/finance ... 00984.html