Here, Universe Today speaks with Henry Dawson, who is a PhD student in the Department of Earth, Environmental, and Planetary Sciences at Washington University in St. Louis and lead author of the study, about his motivation behind the study, significant results, follow-up studies, and whether Dawson believes there’s life on Europa. So, what was the motivation behind this study?
Dawson tells Universe Today, “A large portion of the community has been looking at the habitability potential of the seafloor, and looking at processes that might occur at seafloor hydrothermal vents, or at water–rock interaction chemistry. However, it was never established that there would actually be any fresh rock exposed at the seafloor, or if the tectonic processes that drive hydrothermal vents would be present. The silicate interior of Europa is a similar size to that of Earth’s Moon, which is largely geologically dead on the surface.”
For the study, Dawson and his colleagues examined the likelihood of geologic activity occurring on Europa’s seafloor by analyzing data on Europa’s geophysical characteristics and comparing them with known geologic parameters and processes, including the strength of potential fault lines and fractures within Europa’s rocky interior, how the strength of this rock changes with depth, and how the rock could respond to ongoing stresses such as those driven by convection. Using this, they conducted a series of calculations to ascertain whether the seafloor crust could drive geologic activity. Therefore, what were the most significant results from this study?
“It looks a lot more difficult to expose fresh rock (which is required to drive the reactions that life would exploit) to the ocean,” Dawson tells Universe Today. “Tidal forces do not seem able to cause motion along faults, as they can on the surface, and so the seafloor is most likely still. All the rock that water is able to interact with through porosity was likely altered hundreds of millions to billions of years ago, and so the ocean and rock are in chemical equilibrium. This means that there is no present-day, continuous input of nutrients into the ocean from the rocky core, and so any possible life would likely have to exploit nutrient input from the icy shell above the ocean.”
While this study focused on geologic stresses related to fractures and fault lines, Europa’s interior ocean is produced by another type of geologic stress known as tidal heating, which is induced by the constant stretching and compressing as Europa orbits the much more massive Jupiter. This same tidal process occurs between the Earth and its Moon, and we see it in action in the rising and falling of the Earth’s waters around the globe. For Europa, over the course of thousands to millions of years, the stretching and compressing produces friction in Europa’s inner rocky core, heating it and melting the inner ice into the interior ocean that exists today. It is in this ocean that astrobiologists hypothesize that life could exist, possibly even life as we know it.
Given the study’s findings, Dawson and his colleagues note sobering implications for the potential habitability of Europa: their calculations estimate that geologic activity on Europa’s seafloor is limited enough that habitable conditions within Europa’s interior ocean could be limited, as well. However, the study was quick to note that other geologic processes could be examined to explain the present state of Europa’s seafloor geologic activity, including processes known as serpentinization and thermal expansion anisotropy.
“As rock is exposed to water and chemically alters, the new minerals that form may have a different molar volume than the unaltered minerals in the original rock,” Dawson tells Universe Today. “Serpentinization specifically is the process where peridotite, a typical mantle rock, is altered to serpentinite. This reaction has a net volume increase, which introduces new stresses. These stresses might lead to the fracturing of the rock, fresh rock faces exposed, and more alteration, leading to a self-propagating cycle. On the other hand, the new minerals might cement up pre-existing fractures, preventing further exposure, and creating a negative feedback loop. Thermal expansion anisotropy describes the process where different minerals have varying degrees of expansion upon heating. Thus, when a rock is heated or cooled, the mineral grains inside will push against each other, introducing porosity and interior stresses.”
Regarding the tidal forces responsible for producing Europa’s interior ocean, this icy moon and the Earth’s Moon are not the only planetary bodies in the solar system that could experience these forces. Others include Jupiter’s third Galilean Moon, Ganymede; Saturn’s icy moon, Enceladus; and Saturn’s largest moon, Titan, all of which are currently hypothesized to possess interior oceans from tidal heating. Like Europa, Ganymede exhibits a predominantly crater-free surface, which is indicative of frequent resurfacing, and Enceladus was observed on numerous occasions by NASA’s Cassini spacecraft to have geysers in its south polar region that frequently shoot water into space.
Additionally, Cassini flew through these geysers to obtain data on the ejecta’s composition, discovering organic molecules. For Titan, Cassini data revealed that an interior ocean exists beneath its surface, which is currently hypothesized to contain a combination of ammonia and salts. But regarding this most recent research, what follow-up studies are currently being conducted or planned?
Dawson tells Universe Today, “I’m currently using the same model to estimate whether tidal forces are able to cause fracturing on other icy moons in the outer solar system, such as Ganymede, Enceladus, Titan, and the mid-size Uranian moons. Based on my preliminary results that I presented at LPSC, it appears that tidal forces are insufficient on those moons as well. In addition, our collaborator Austin Green is looking at whether seafloor volcanism might occur, based on the forces that volcanic dikes can exert on the rock that they are propagating through. For Europa, the lithosphere is too deep and too strong for magma to reach the seafloor, and so any melt that forms in the mantle stalls out at depth.”
Although Europa was discovered by Galileo Galilei in 1610, the fascination with finding life within its ocean has only emerged within the last few decades, thanks largely to the NASA Voyager missions: Voyager 1 and Voyager 2 flew through the Jupiter system in 1979 and imaged the Galilean Moons up close and in detail for the first time, hinting that Europa was geologically active. This is because Europa has almost no visible craters across its entire surface, indicating specific processes are responsible for reshaping the small moon and covering up evidence of past impacts. Europa, the second Galilean Moon, shares these traits with the first and third Galilean Moons, Io and Ganymede, respectively, while the fourth Galilean Moon, Callisto, has a surface that is almost entirely covered by craters.
Thanks to further data obtained from subsequent missions, including NASA’s Galileo spacecraft, the Hubble Space Telescope, and Juno, scientists are almost entirely convinced that an interior ocean lies beneath Europa’s icy crust, with some estimates putting the volume of liquid water at double that of all of Earth’s oceans. On Earth, liquid water is essential for life, which is why Europa’s interior ocean is a prime target for astrobiology research. But does Henry Dawson think there’s life on Europa?
Dawson tells Universe Today, “I think there’s still a lot more that I would like to understand before I make a yes or no statement on that. While I believe that Europa is one of the most likely candidates to host life, alongside Enceladus, the chance of life remains small, and this research reduces the probability even more.”
This study comes as NASA prepares to launch the Europa Clipper spacecraft this October, with a planned arrival date of April 2030; the mission is designed to explore the habitability potential of Europa and its interior ocean. During its 3.5-year mission, Clipper will perform up to 44 close flybys of Europa at altitudes between 25 and 2,700 kilometers (16 to 1,678 miles), flying elongated orbits of Jupiter to keep from staying within Jupiter’s powerful magnetic field for too long. To assess Europa’s habitability potential, Clipper will carry a powerful suite of scientific instruments designed to analyze Europa’s chemistry, surface geology, and interior ocean characteristics.
Additionally, the European Space Agency’s Jupiter Icy Moons Explorer (JUICE) mission was launched in April 2023 with a planned orbital insertion at Jupiter in July 2031, followed by a departure from Jupiter and an orbital insertion around Ganymede in December 2034. Like Clipper, JUICE is designed to investigate the habitability potential of the icy moon, but will also examine Ganymede and Callisto, as well.
“Get excited for the Europa Clipper and JUICE missions!” Dawson exclaims to Universe Today. “While it will still be 6 years before they reach Jupiter, once they arrive, we will be able to learn much more about what is going on at Europa. While they will not be able to directly measure the interior, observations of the ice shell, gravity field, and tidal forcing on Europa will help to constrain future models. As well, always be careful about the assumptions you make for other planetary bodies. While Europa may be covered with ice, it is truly a rocky world that happens to have a deep ocean, and the processes occurring at depth may not reflect what we see at Earth’s seafloor.”
Is Europa’s seafloor geologically active, and what new insights will Europa Clipper and JUICE make about this astonishing and intriguing icy moon in the coming years and decades? Only time will tell, and this is why we science!
As always, keep doing science & keep looking up!
The post Europa Might Not Be Able to Support Life in its Oceans appeared first on Universe Today.
In 2023, the ESA put out a call for small lunar missions. The call was associated with their Terra Novae exploration program, which will advance the ESA’s exploration of the Solar System with robotic scouts and precursor missions. “Humankind will benefit from the new discoveries, ambitions, science, inspiration, and challenges,” the ESA explains on their Terra Novae website.
Terra Novae has several goals, one of which is to “Land multiple scientific payloads on the surface of the Moon, prospecting for the presence of water and other volatile materials that will both reveal its history and help prepare sustainable exploration by locally sourced space resources.”
In response to the ESA’s call, a team of European researchers have proposed the LunarLeaper. The LunarLeaper is a hopping robot that would visit a lunar skylight, a collapsed part of a lunar lava tube. The robot would give us our first look at the lunar subsurface and the lava tubes.
There are good reasons to explore these lava tubes. The lunar surface is exposed to solar and cosmic radiation without the benefit of a protective atmosphere or magnetosphere like Earth’s. Astronauts could shelter in these tubes inside habitat modules. Several meters of rock overhead would provide protection from radiation and from the Moon’s temperature swings. There could be laboratory modules and other modules as well. The tubes, if suitable, could shelter an entire base.
The other reason is scientific. These tubes are a window into the Moon’s volcanic past. They’re a record of the magnitude and timing of volcanic activity.
The LunarLeaper is a ~10 kg (22 lbs) leaping robot with three legs. It’s based on the ETH SpaceHopper design which has been refined over four years of development. SpaceHopper is designed to visit asteroids with much weaker gravity than the Moon, but the design can be adapted to work on the lunar surface.
The LunarLeaper team proposes a mission to the Marius Hills region. It’s a region in Oceanus Procellarum, a vast lunar mare on the near side of the Moon. It’s a volcanic region covered in basalt floods from ancient volcanic activity. Marius Hills is named after the 41 km (25 mi) diameter crater Marius and is littered with volcanic features like rilles, domes, and cones.
The particular feature of interest in Marius Hills is the Marius Hills Pit (MHP), a collapsed skylight granting access into what might be an extensive lunar lava tube system. The Lunar Reconnaissance Orbiter captured an image of the intriguing opening featured in the lead image. That’s where the LunarLeaper would do its work.
The Leaper would move around the rim of the MHP, capturing images of the pit walls and the floor. It would also use its suite of scientific instruments to gather pertinent data. Its instrument suite would include a gravimeter, a ground-penetrating radar, a dedicated science camera, and hopefully a spectrometer.
The LunarLeaper team outlines four questions the mission hopes to answer.
Though there are hundreds of similar pits on the Moon, the MHP appears to be the most promising one. It has been imaged from different illumination angles, and the imaging supports the idea that a tube extends underground beyond the skylight. Since the Marius Hills region is filled with volcanic features, an extended tube is plausible.
The LunarLeaper would travel around the surface near the MHP and use its ground-penetrating radar to uncover the extent of the tube system. Other proposed missions are aimed at lava tubes and skylights, but they tend to be more complex, larger, and more expensive. As a 10 kg hopping robot, LunarLeaper would be a wise choice for the first mission to characterize the MHP prior to sending a more complex, thorough mission.
When it comes to exploring the pit, the LunarLeaper has a significant advantage over a wheeled rover. Wheeled rovers select routes based on obstacle avoidance. They have some strict limitations when it comes to the terrain they can safely and effectively traverse.
However, the rim of the MHP is expected to be challenging. There is likely complex terrain and steep slopes right near the opening. Getting as close as possible to the rim will give better imaging and science results. The LunarLeaper has an advantage over wheeled rovers in this type of terrain, though the tradeoff is its much lighter payload.
However, as a first step in exploring the MHP, the LunarLeaper has some clear advantages.
The LunarLeaper team says that the small robot could be delivered to the lunar surface by one of the several small landers being designed by different companies. They peg the cost at about 50 million euros. They also say that this type of legged jumping robot could be a big part of future space exploration and that their mission, if chosen, could be a key development for the future.
The post It’s Time to Study Lunar Lava Tubes. Here’s a Mission That Could Help appeared first on Universe Today.
It’s not quite a rocket, but this homemade model can lift off and land! Here’s the project, from Bribro12 via hackster.io:
In the beginning of this year I was really amazed by the fast and successive achievements SpaceX made. So I decided I wanted to do a SpaceX related project. Being a SpaceX fan for a few years now, I decided to go for the Falcon 9 inspired look on my build. I say ‘inspired’ because I took some general proportions from the Falcon 9 rocket but had to make a lot of changes to get the “rocket” functional at the scale I was intending to build it. The “rocket” has deployable landing gear, a thrust vector control unit and uses a 6s 70mm 12 blade fan capable of 2.5 kg of thrust. The thrust vector control unit is controlled by a Pixhawk X4 running Ardupilot V4.0.4 which is configured as a single copter. I’m essentially building a single copter in the shape of a rocket.
We’re pleased to release our new GPS jamming map tool. The new map allows users to view areas of GPS jamming and interference around the world in an easy to understand visual format.
The map uses a color-coded overlay to indicate low (green) to high (red) levels of interference with global navigation satellite systems (GNSS). Often referred to simply as GPS, there are actually multiple systems besides the US GPS constellation, such as Russia’s GLONASS, Europe’s Galileo, China’s BeiDou, and others.
As part of the ADS-B messages we receive from each aircraft, the Navigation Integrity Category (NIC) encodes the quality and consistency of the navigational data received by the aircraft. The NIC value indicates how certain the aircraft is of its position by providing a radius of uncertainty: the larger the radius, the less certain the position update. We use the NIC values broadcast by aircraft passing through a particular area over time to calculate GPS jamming and interference.
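Flightradar24 does not publish its exact formula, but the idea can be sketched in a few lines: map each broadcast NIC value to its containment radius (the table below follows the standard ADS-B encoding), then treat the fraction of low-NIC reports in a map cell as a proxy for jamming. The threshold and function names here are illustrative assumptions, not Flightradar24's implementation.

```python
# NIC -> horizontal containment radius in metres, per the ADS-B standard
# (NIC 0 means the uncertainty is unknown/unbounded).
NIC_RADIUS_M = {
    11: 7.5, 10: 25.0, 9: 75.0, 8: 185.2, 7: 370.4, 6: 555.6,
    5: 1852.0, 4: 3704.0, 3: 7408.0, 2: 14816.0, 1: 37040.0, 0: None,
}

def jamming_fraction(nic_reports, low_nic_threshold=6):
    """Hypothetical jamming proxy for one map cell.

    nic_reports: NIC values broadcast by aircraft passing through the cell.
    A high fraction of low-NIC (large-uncertainty) reports suggests GNSS
    degradation or jamming in that cell.
    """
    if not nic_reports:
        return 0.0
    degraded = sum(1 for nic in nic_reports if nic < low_nic_threshold)
    return degraded / len(nic_reports)

# A cell where most aircraft report NIC 0-2 would render "red" on the map
print(jamming_fraction([0, 1, 2, 8, 0, 1]))  # 5 of 6 reports degraded
```

A real implementation would also weight by report count and smooth over time, which is presumably why the map offers 6-hour and 24-hour views.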
Jamming and interference can be displayed in a resolution of six (6) hours or 24 hours. Six hour resolution will show the most recent activity, while 24 hour resolution provides a broader perspective for the previous day.
Use the selector in the upper left corner of the map to adjust the date and time for which you would like to view data. Use the forward and back buttons underneath the date selector to advance one time period in either direction or press the play button to begin a slide show from the date and time selected.
At the moment, GPS jamming is calculated using only data from areas where there are an adequate number of flights and an adequate number of Flightradar24 terrestrial ADS-B receivers. We are working on incorporating additional data sources in the near future to further enhance the map.
The post Flightradar24’s new GPS jamming map appeared first on Flightradar24 Blog.
It was hoped that the aptly named Hubble Space Telescope (launched in 1990) would resolve this tension by providing the deepest views of the Universe to date. After 34 years of continuous service, Hubble has managed to shrink the level of uncertainty but not eliminate it. This led some in the scientific community to suggest (as an Occam’s Razor solution) that Hubble‘s measurements were incorrect. But according to the latest data from the James Webb Space Telescope (JWST), Hubble’s successor, it appears that the venerable space telescope’s measurements were right all along.
The research was conducted by the Supernova H0 for the Equation of State of Dark Energy (SH0ES) project, an international effort to eliminate uncertainties in the Hubble-Lemaitre Constant. The team is led by Dr. Adam Riess and consists of astrophysicists from the Space Telescope Science Institute (STScI), Johns Hopkins University (JHU), the NSF National Optical-Infrared Astronomy Research Laboratory (NOIRLab), Duke University, the École Polytechnique Fédérale de Lausanne (EPFL), and Raytheon Technologies. Their findings were published in the February 6th, 2024, issue of The Astrophysical Journal Letters.
The Hubble Tension arises from the fact that different distance measurements (aka. the “Cosmic Distance Ladder“) result in different values. For the calibration of short distances or the first “rung” on the ladder, astronomers rely on parallax measurements of nearby stars. For the next “rung,” they rely on Cepheid variables and Type Ia supernovae to measure the distances to objects tens of millions of light-years away. Distance measurements for these stars by Hubble yielded a value of 252,000 km/h per megaparsec (Mpc), or about 70 km/s per Mpc.
The final rung consists of using redshift measurements of the Cosmic Microwave Background (CMB) to calibrate distances of billions of light-years. The mapping of this background by the ESA’s Planck satellite yielded an estimate of about 244,000 km/h per Mpc (about 67.8 km/s per Mpc). The simplest explanation for the discrepancy was that Hubble‘s measurements were inaccurate, perhaps because of uncertainties in the Cosmic Distance Ladder. Since it was launched in December 2021, the JWST has made its own measurements of Cepheid variables with its advanced infrared optics.
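The press-release figures above are quoted in km/h per megaparsec; converting them to the km/s per Mpc units astronomers normally use makes the size of the tension easy to see. A quick sketch (using only the two rounded values from the text):

```python
# Convert expansion rates from km/h per Mpc to the conventional km/s per Mpc
# and compute the fractional discrepancy between the two measurements.
def kmh_to_kms(rate_kmh_per_mpc):
    return rate_kmh_per_mpc / 3600.0  # 3600 seconds per hour

h0_ladder = kmh_to_kms(252_000)  # Hubble distance-ladder (Cepheid/SN) value
h0_planck = kmh_to_kms(244_000)  # Planck CMB-based value

tension_pct = 100.0 * (h0_ladder - h0_planck) / h0_planck
print(f"Distance ladder: {h0_ladder:.1f} km/s/Mpc")  # 70.0
print(f"Planck CMB:      {h0_planck:.1f} km/s/Mpc")  # 67.8
print(f"Discrepancy:     {tension_pct:.1f}%")        # ~3.3%
```

A few percent may sound small, but it is well outside the stated uncertainties of either method, which is exactly why the tension matters.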
This has allowed astronomers to cross-check the optical-light measurements made by Hubble. This includes Riess, the Bloomberg Distinguished Professor and Thomas J. Barber Professor of Physics and Astronomy at Johns Hopkins University. In 2011, Riess was awarded the Nobel Prize in Physics and the Albert Einstein Medal for his co-discovery of the accelerating rate of cosmic expansion – which led to the theory of “Dark Energy.” The team’s first look at Webb’s observations in 2023 confirmed that Hubble’s measurements of the expanding Universe were accurate.
Their latest analysis was based on Webb’s observations of over 1,000 Cepheids used as “anchors” in the distance ladder, eight Type Ia supernovae, and NGC 5468 – the farthest galaxy where Cepheids have been well measured, roughly 130 million light-years distant. As Riess stated in an ESA press release, these findings have erased any lingering doubt about measurement errors:
“With measurement errors negated, what remains is the real and exciting possibility that we have misunderstood the Universe. We’ve now spanned the whole range of what Hubble observed, and we can rule out a measurement error as the cause of the Hubble Tension with very high confidence.”
In particular, these findings have eliminated any lingering doubts that measurement inaccuracies might grow with distance. These inaccuracies would result from “stellar crowding,” where light from the Cepheids blended with that of adjacent stars. For many astronomers, the prospect of looking deeper into the Universe meant that these errors would become visible. Accounting for this effect is made all the more difficult thanks to intervening dust in the interstellar and intergalactic medium (ISM, IGM) that naturally obscures visible light.
Thanks to Webb’s sharp imaging capabilities at infrared wavelengths, astronomers can now see through the obscuring dust and get a clearer look at distant Cepheids. By combining Webb’s measurements with Hubble’s, the SH0ES team determined that Hubble‘s observations were correct. As a result, said Riess, scientists are left with only one explanation for the Hubble Tension, which is that there is an unseen force responsible for how the cosmos is expanding:
“Combining Webb and Hubble gives us the best of both worlds. We find that the Hubble measurements remain reliable as we climb farther along the cosmic distance ladder. We need to find out if we are missing something on how to connect the beginning of the Universe and the present day.”
Next-generation telescopes will investigate this mysterious unseen force in the coming years by measuring its influence on cosmic expansion. This includes NASA’s upcoming Nancy Grace Roman Space Telescope and the ESA’s Euclid mission (which launched on July 1st, 2023). Paired with additional data obtained by Webb, these observations will allow astronomers to test “early Dark Energy” and other theories that attempt to explain the observations of Hubble and Webb. In the meantime, the so-called “crisis in cosmology” will persist, but perhaps not for long.
Further Reading: ESA
The post Webb Continues to Confirm That Universe is Behaving Strangely appeared first on Universe Today.
The Vela supernova remnant is visible in long exposure photographs in the constellation Vela. It is the result of a star more massive than the Sun reaching the end of its life. As the progenitor star evolved, the fusion deep in its core ceased. Without fusion, the outward-pushing thermonuclear pressure vanished and the star instantly imploded under the immense force of gravity. The inward-rushing material then rebounded, producing the supernova explosion we see. The shockwave from the event is still travelling through the surrounding gas cloud thousands of years later.
The image recently released is one of the largest images ever taken of the object with the DECam camera. The instrument, built by the Department of Energy, was mounted on the 4 metre Víctor M. Blanco telescope in Chile. It reveals amazing levels of detail with red, yellow and blue tendrils of gas. The image was taken through three colour filters in a technique familiar to amateur astronomers. The filters capture specific wavelengths of light, and the exposures are then stacked on top of each other during processing to reveal the stunning high resolution colour image.
Supernova explosions of this type take hundreds of thousands of years for their effects to dissipate; however, the core of the collapsed star does remain. As the star collapses, the core is compressed, leaving an ultra-dense sphere of neutrons, the result of protons and electrons having been forced together under extreme pressures. The Vela Pulsar is only a few kilometres across but contains as much mass as the Sun. The stellar remnant rotates rapidly, about 11 times per second, sweeping a powerful beam of radiation across the Galaxy.
Previous images from other instruments highlight the incredible capabilities of DECam. Coupled to the 4 metre telescope in Chile, it operates like a conventional camera. Light enters the telescope and is redirected back up the tube by the large mirror. The light passes into DECam, through a 1 metre corrective lens, and then arrives at its final destination: a grid of 62 charge-coupled devices. These sensors generate charge in proportion to the amount of light that falls upon them. With an array of these sensors (570 million pixels, to be exact), a high resolution image can be recreated!
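The filter-stacking step described above is conceptually simple: each filter exposure is a 2-D intensity array, and after scaling, the three arrays become the red, green and blue channels of one colour image. The sketch below illustrates the idea with synthetic data and NumPy; it is not NOIRLab's actual processing pipeline, which involves far more careful calibration.

```python
import numpy as np

def composite(red_filter, green_filter, blue_filter):
    """Stack three per-filter exposures into one RGB image (illustrative)."""
    channels = []
    for img in (red_filter, green_filter, blue_filter):
        img = img.astype(float)
        lo, hi = img.min(), img.max()
        # Normalize each channel to the 0..1 range before stacking
        channels.append((img - lo) / (hi - lo + 1e-12))
    return np.dstack(channels)  # shape (height, width, 3)

# Synthetic 4x4 "exposures" standing in for per-filter CCD frames
rng = np.random.default_rng(0)
rgb = composite(*(rng.random((4, 4)) for _ in range(3)))
print(rgb.shape)  # (4, 4, 3)
```

Amateur astrophotographers do essentially the same thing when combining narrowband or RGB filter frames, just with alignment and stretching steps added.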
Source: Ghostly Stellar Tendrils Captured in Largest DECam Image Ever Released
The post This is a 1.3 Gigapixel Image of a Supernova Remnant appeared first on Universe Today.
Hackster.io user JP Funk shared their IoT Deep Dive Boot Camp Mid Term Project 2 Biosphere that uses Adafruit IO and a Monochrome 1.3″ 128×64 OLED graphic display:
JP’s IoT Deep Dive Boot Camp Mid Term 2 “Space Biosphere” Automatic Plant Watering Project with Adafruit.IO Dashboard. The plant watering project is designed to employ the Particle Photon 2, with automated code functions receiving values every 30 minutes from the Seeed capacitive soil sensor and translating the value into a function that detects when the soil is reaching a dry state, then turning on the micro pump to water the plant for a 1 second duration. The system also utilizes sensors to monitor dust, air quality, temperature, humidity and barometric pressure, reading out the values locally on the OLED display and also sending them to the Adafruit.IO Dashboard. The water pump can also be toggled on/off through the Adafruit dashboard for wireless control.
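The core control loop described above (sample every 30 minutes, pulse the pump for one second when the soil reads dry) is easy to sketch. The real project runs C++ firmware on the Particle Photon 2; the Python below is only an illustration of the logic, and the sensor threshold and helper names are assumptions, not JP's actual code.

```python
DRY_THRESHOLD = 1500   # hypothetical raw capacitive-sensor reading for "dry"
PUMP_PULSE_S = 1.0     # the project waters for a 1 second duration

def check_and_water(soil_reading, pump_on, pump_off, sleep):
    """Called once per sampling interval: pulse the pump if the soil is dry.

    Assumes a lower reading means drier soil (capacitive sensors vary;
    the real firmware's calibration may differ).
    """
    if soil_reading < DRY_THRESHOLD:
        pump_on()
        sleep(PUMP_PULSE_S)  # run the micro pump for the fixed pulse
        pump_off()
        return True
    return False

# Simulated run with stub hardware callbacks recording what happened
events = []
watered = check_and_water(
    soil_reading=1200,
    pump_on=lambda: events.append("on"),
    pump_off=lambda: events.append("off"),
    sleep=lambda s: events.append(f"wait {s}s"),
)
print(watered, events)  # True ['on', 'wait 1.0s', 'off']
```

Passing the pump and sleep functions in as callbacks keeps the decision logic testable without hardware, a pattern that translates directly to the Particle firmware.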
We tend to think of stars as stationary objects in the sky, except for their slow westward drift as the Earth rotates. The reality is different though: stars do move, but due to the vast distances of interstellar space, that motion is largely not noticeable. There are exceptions, such as Barnard’s Star in the constellation Ophiuchus. This inconspicuous red dwarf moves 10.39 seconds of arc each year (by comparison, the full Moon is about 1,900 seconds of arc in diameter).
Another type of star can be observed: hypervelocity stars (HVSs), which are among the fastest objects in the Galaxy. They are defined as stars with velocities of the order of 1,000 km per second; by comparison, the Earth travels through space at around 30 km per second! The first was discovered in 2005, but since then a number of HVSs have been found, and some of them have the potential to escape from the Milky Way.
Typically the motion of stars is the result of their orbit around the centre of a galaxy. Our own star, the Sun, takes 220 million years to complete one orbit of the centre of the Milky Way. The high velocity of HVSs is believed to stem from gravitational interactions between binary stars and black holes, an idea proposed by the stellar dynamicist Jack Hills. In this process, a black hole (stellar-mass, or the supermassive black hole at the Galactic centre) captures one member of a binary star system while the other is ejected at high velocity. Other theories include the ejection of one member of a binary system when the other goes supernova, or ejection during galactic interactions.
To understand the interactions between the Milky Way and the Andromeda Galaxy, the team (led by Lukas Gülzow from the Institute for Astrophysics in Germany) had to work through painstaking analyses. First they had to understand the relative motion of the two galaxies; they then had to model the gravitational potential of the entire system – the total acceleration acting upon an object at any position in either of the galaxies at any time. Finally, the team could run simulations of stellar motion to model the HVSs’ trajectories.
The study calculated the trajectories of 18 million HVSs for two different scenarios: one in which the two galaxies have equal mass, and one in which the Milky Way has about half the mass of the Andromeda Galaxy. The starting positions of the HVSs in the simulation were randomly generated around the centre of Andromeda. The ejection directions were random, and the results showed that 0.013 and 0.011 percent of the HVSs, respectively, are now within a radius of 50 kpc around the Milky Way centre.
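One small but easy-to-get-wrong ingredient of such a Monte Carlo setup is drawing genuinely isotropic ejection directions (uniform angles in theta and phi would pile stars up at the poles). The sketch below shows the standard recipe, using only the standard library; it is an illustration of the technique, not the authors' code.

```python
import math
import random

def random_direction(rng):
    """Uniform random unit vector on the sphere.

    Sampling cos(theta) uniformly on [-1, 1] (rather than theta itself)
    is what makes the distribution isotropic.
    """
    cos_t = rng.uniform(-1.0, 1.0)
    sin_t = math.sqrt(1.0 - cos_t * cos_t)
    phi = rng.uniform(0.0, 2.0 * math.pi)
    return (sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t)

rng = random.Random(42)
dirs = [random_direction(rng) for _ in range(10_000)]

# Sanity checks: every vector has unit length, and the z-components
# average to roughly zero, as isotropy requires.
norms = [math.sqrt(x * x + y * y + z * z) for x, y, z in dirs]
mean_z = sum(z for _, _, z in dirs) / len(dirs)
print(round(min(norms), 6), round(max(norms), 6), abs(mean_z) < 0.05)
```

Scaled up to 18 million draws and combined with a model of the two-galaxy potential, counting how many trajectories end within 50 kpc of the Milky Way centre gives fractions like those quoted in the study.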
They explored the velocity of the HVSs on arrival in both galaxy-mass simulations and found that many approximately retain their initial velocity. Interestingly, given the time taken for the journey, a significant proportion may well evolve off the main sequence en route. Some of the HVSs slow down sufficiently to be captured by the Milky Way.
The team mapped the simulated position of stars against the sky and ran the data against high velocity star positions from Gaia data (Release 3) and found the simulated position distribution consistent with the Gaia data. The study concludes that it is highly likely that HVSs from Andromeda could indeed migrate to the Milky Way. Whilst they are not expected in their thousands, they are expected to distribute equally around the Milky Way centre. It might even be possible to detect them based on stellar velocity and trajectories but further studies are now required to take that next step.
Source: On Stellar Migration from Andromeda to the Milky Way
The post Are Andromeda and the Milky Way Already Exchanging Stars? appeared first on Universe Today.
]]>Figure 1: Scientific observations with GNSS radio occultation (GNSS-RO), GNSS grazing-angle reflectometry (GNSS-GR) and GNSS reflectometry (GNSS-R) techniques from low-Earth orbit (LEO). (Figure provided by the author)
Global navigation satellite systems (GNSS) for peaceful uses are facing a hard reality due to increasing regional conflicts in recent years. As a dual-use technology, GNSS for civil, commercial and scientific applications is vulnerable to both denied/degraded service and flex power operation from GNSS satellites and to jamming from the ground.
One of the vulnerable scientific applications is the use of GNSS receivers on low-Earth orbit (LEO) satellites that utilize the civil navigation signals for Earth observation. These remote sensing techniques, such as GNSS radio occultation (GNSS-RO), GNSS grazing-angle reflectometry (GNSS-GR) and GNSS reflectometry (GNSS-R) (see figure 1), are designed to observe weak GNSS signals either bounced off from Earth’s surface or refracted by the atmosphere. Thus, GNSS flex power operation and intentional radio frequency interference (RFI) can severely degrade the quality of the scientific data or even prevent Earth observation.
One example of such impacts is a dramatic decrease of GNSS-RO observations over Europe and the Middle East during 2023. Monthly statistics from Spire show the region without GPS-RO measurements grew substantially from the localized Ukraine-Russia conflict zone in January to a much wider area in Eastern Europe and the Middle East in December 2023 (see figure 2).
Figure 2: Number density distribution of monthly GNSS-RO measurements from the GPS tracking by the Spire constellation over Europe and the Middle East in 2023. The black area indicates no data. (Figure provided by the author)
This vast data void in the science observation is likely a result of the intensified electronic warfare in the Ukraine-Russia and nearby conflict regions. The Spire RO receivers are configured to track the civil signals from GPS, GLONASS and Galileo. To increase signal protection against jamming in a conflict zone, GNSS service providers often use flex power operation. However, flex power operation can cause poor-quality tracking with the RO receiver due to weaker civil signal power. Unlike a precise orbit determination (POD) antenna, GNSS-RO antennas typically have high gain to improve the detection of weak GNSS signals at limb and occulted views. If the transmitted power of the civil signals drops below a quality-control (QC) threshold, the data are flagged as bad. This results in poor Spire GNSS-RO coverage in the conflict zones.
Lost or degraded GNSS-RO, GNSS-R and GNSS-GR observations are unfortunate, as these all-weather, long-term-stable, high-accuracy measurements are becoming increasingly important in scientific research. GNSS-RO is a remote sensing technique that uses the GNSS-LEO link to profile Earth’s atmosphere and ionosphere with high vertical resolution. Since the first GNSS-RO six-satellite constellation, known as the Constellation Observing System for Meteorology, Ionosphere and Climate-1 (COSMIC-1), these high-quality RO profiles have become a key data source for weather forecasting, climate monitoring, model evaluation, and space weather research. The current backbone of GNSS-RO observations comes from the COSMIC-2 and Spire constellations, which have been producing more than 20,000 profiles per day since 2020. GNSS-R is a bi-static radar technique that uses GNSS signals reflected by the surface for altimetry and for measurements of ocean surface wind speed, wave height, sea ice, soil moisture, and inundation. At a view angle between those of GNSS-RO and GNSS-R, GNSS-GR can provide complementary measurements of sea ice and atmospheric column water vapor. Thanks to low-cost LEO SmallSat/CubeSat constellations carrying GNSS receivers, geoscience studies have benefited greatly from the sampling density and coverage of these new data.
Civilization and science have been disrupted by wars before. With our dependence on GNSS only increasing in recent years, its vulnerability to jamming and flex power operation poses a great challenge for scientific observations that need uniform global coverage.
The post Research Report: A Black Hole in Earth science first appeared on GPS World.
]]>I think Captain James T. Kirk would be proud of NASA for boldly going, this time with another message to the cosmos aboard the Europa Clipper. The destination is Jupiter’s moon Europa, which has an icy crust and, it is thought, a subsurface ocean. If the ocean exists, and all evidence seems to point to its presence, then it likely holds twice as much water by volume as Earth’s oceans. The plaque has been attached to commemorate the connection between the two worlds.
The triangular tantalum plaque measures about 18 × 28 cm and carries an engraving of a handwritten poem by Ada Limón, “In Praise of Mystery: A Poem for Europa”. The 2.6 million names are engraved on a silicon microchip set at the centre of an illustration of a bottle amid the Jovian system: NASA’s message in a bottle.
In a statement, Lori Glaze, director of NASA’s Planetary Science Division, said, “The plate combines the best humanity has to offer across the Universe – science, technology, education, art and math.” She went on to say, “The message of connection through water, essential for all forms of life as we know it, perfectly illustrates Earth’s tie to this mysterious ocean world we are setting out to explore.”
One perhaps more controversial inclusion is the famous Drake Equation. Scientists have been divided about the validity and benefit of this equation, developed by Frank Drake in 1961, which attempts to answer mathematically how many advanced civilisations there may be in our Galaxy. Whatever its level of support, the equation has been etched onto the inward-facing side of the plate as well.
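For readers curious what the etched equation actually says, here is a minimal sketch in Python. The equation itself is standard, but the parameter values below are purely illustrative placeholders, not estimates from the plaque or from this article:

```python
# Drake equation: N = R* · fp · ne · fl · fi · fc · L
# All parameter values used below are illustrative placeholders only.

def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """Estimate N, the number of communicative civilizations in the Galaxy.

    R_star: mean rate of star formation (stars/year)
    f_p:    fraction of stars with planets
    n_e:    mean number of habitable planets per star that has planets
    f_l:    fraction of habitable planets on which life arises
    f_i:    fraction of those on which intelligence evolves
    f_c:    fraction of those that develop detectable communication
    L:      mean lifetime of a communicative civilization (years)
    """
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# Placeholder example: even tiny fractions can still yield N >= 1
# if civilizations are long-lived, which is why L dominates debates.
N = drake(R_star=2.0, f_p=0.5, n_e=1.0, f_l=0.1, f_i=0.01, f_c=0.1, L=10_000)
print(N)
```

The controversy the article mentions is visible in the code: every factor after `f_p` is essentially a guess, so N can be pushed almost anywhere by the choice of inputs.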
The probe is scheduled to launch later this year and, after a 2.6 billion km journey, will arrive at Europa in 2030. It will then make a total of 49 flybys of Europa to try to establish whether conditions there could support life. To that end, it carries a host of instruments to explore the subsurface ocean, the crust, the atmosphere and the space environment around the moon. To protect the instruments from Jupiter’s intense radiation, they are housed in a metal container, with one of its openings sealed by the plaque.
The illustrations don’t just advertise what we are like, they also depict how we communicate. References are made to radio frequencies that we could use for interstellar communication just in case an alien civilisation intercepts the probe some time in the future. It reveals how we use radio bands to listen out for alien signals and includes the frequencies emitted by water.
If all of that wasn’t enough, in a lovely touch and a nod to one of the founders of planetary science and an advocate for the mission, there is a portrait of Ron Greeley too. He laid the very building blocks for the mission, and it is a fitting gesture that he should be travelling to Jupiter with the craft he dreamed of.
Source: NASA Unveils Design for Message Heading to Jupiter’s Moon Europa
The post This is Europa Clipper’s Version of the Golden Record appeared first on Universe Today.
]]>They need a radio telescope that’s just one single, massive dish.
Many astronomical objects emit radio waves. From massive galaxies to individual molecules, radio waves and the observatories that sense them provide insights into these objects in ways that other observatories can’t. But there’s a problem. In order to do radio astronomy with a usable signal-to-noise ratio, astronomers need huge antennae or dishes. That’s why ALMA exists. It’s a collection of dishes working together via interferometry to create a much larger dish.
But as powerful as ALMA is, and as much as it continues to make a huge contribution to astronomy, it has its limitations.
That’s why some in the astronomical community are calling for a new radio telescope with one single large dish. It’s called AtLAST, the Atacama Large Aperture Submillimeter Telescope, and the idea has been percolating for a few years. Now, a new paper is fine-tuning it.
The paper is “Design of the 50-meter Atacama Large Aperture Submm Telescope,” and it’s currently in pre-print. The lead author is Tony Mroczkowski, an astronomer and submillimetre instrument specialist at the European Southern Observatory (ESO), one of the organizations behind ALMA.
“Submillimetre and millimetre wavelengths can reveal a vast range of objects and phenomena that are either too cold, too distant, or too hot and energetic to be measured at visible wavelengths,” the paper states. They point out that the astronomical community has “highlighted the need for a large, high-throughput sub-mm single dish” radio observatory that can advance radio astronomy.
“The Atacama Large Aperture Submillimeter Telescope (AtLAST), with its 50-m aperture and 2° maximal field of view, aims to be such a facility,” they explain.
Their paper presents the full design concept for AtLAST.
AtLAST’s large 50-meter aperture is its critical feature. Smaller apertures, even when combined in an interferometer like ALMA, tend to filter out faint, extended emission: an interferometer lacks the short baselines needed to recover large-scale structure. That’s why two or more smaller dishes can’t fully replace a single large one.
There are some existing large-aperture radio antennae, like Japan’s Nobeyama 45-m telescope and the IRAM 30-m telescope, but due to their designs and sites they can’t observe as well as AtLAST will. AtLAST will be able to see closer to the peak of the spectral energy distribution (SED) of galaxies and will be able to observe far-infrared (FIR) emission lines in the interstellar medium and in high-redshift galaxies. ALMA can observe these SEDs and FIR lines, but not as well as AtLAST will.
Existing large dishes also have smaller fields of view (FOV). AtLAST’s design, by contrast, was driven by the need for a large FOV of 2 degrees. This will give AtLAST a much higher mapping speed for science cases that need fields of several hundred square degrees.
AtLAST’s overarching scientific goal is multifaceted. The telescope will perform the most complete, deepest, and highest-resolution submillimetre survey of the Milky Way, covering gas clouds, protoplanetary disks, protostars, and dust. AtLAST will also survey parts of the Local Group of galaxies, and it will even be able to detect complex organic molecules, the precursors to life.
The gas and dust in the Universe is of particular interest to AtLAST. Much of the gas and dust in the Universe is cold and dense. The interstellar medium (ISM) consists of clouds of gas and dust that have unique spectral signatures in the sub-millimetre range. ALMA has given us some of our best looks at these structures with high-resolution images of some of the fine details of the ISM. But single-dish antennae have given astronomers glimpses of other discoveries waiting to be made. That’s one of the reasons the international astronomy community is so enthusiastic about AtLAST.
AtLAST will also be able to take a census of star-forming galaxies at high redshifts. It’ll also map out the reionization of the Universe and track the Universe’s dust, gas, and metallicity across cosmic time.
AtLAST will dig into the deeper, fundamental aspects of galaxies by examining the circumgalactic medium (CGM). The CGM is cold gas and dust that exists in galactic haloes and shapes the evolution of galaxies. This material is invisible at other wavelengths.
The radio telescope’s single-dish design has some advantages over ALMA that are separate from its dish size and its field of view. As a single-dish antenna, AtLAST will be able to switch targets quickly and even track moving targets. It’ll employ several different scanning modes, as well as tracking modes that allow the telescope to track comets, asteroids, and near-Earth objects. Its innovative rocking chair design is behind some of AtLAST’s performance, a design it shares with extremely large optical telescopes like the ELT.
AtLAST will be designed to last many decades. It’ll have six instrument bays and will allow rapid switching between instruments. With a nod to our changing climate, AtLAST will be powered by renewable energy.
But what it’s really all about is science.
“The design presented here is expected to meet all of the specifications set for AtLAST to achieve its broad scientific goals,” the paper states. The details of the design allow it to meet the stringent requirements needed to reach those goals. “Namely, these are the large field of view, the high surface accuracy, fast scanning and acceleration, and the need to deliver a sustainable, upgradeable facility that will serve a new generation of astronomers and remain relevant for the next several decades.”
It’s a complex project, as are all astronomical observatories. But as technology advances, so does the complexity. There’s a lot of work yet to be done and quite a bit of time before construction can even begin.
“Despite the amount of work that remains to be done, AtLAST is on track to potentially begin construction, if fully funded, later this decade,” the authors conclude.
The post Astronomers Propose a 50-Meter Submillimeter Telescope appeared first on Universe Today.
]]>Anyone who’s ever owned an older car will know the feeling: the nagging worry at the back of your mind that today might be the day something important actually stops working. Oh, it’s not the little problems that bother you: the rips in the seats, the buzz from the rear speakers, and that slow oil leak might have annoyed you at first, but they eventually just blend into the background. So long as the car starts and can get you from point A to point B, you can accept the sub-optimal performance that inevitably comes with age. Someday you’ll no longer be able to ignore the mounting issues and will have to get a new vehicle, but today isn’t that day.
Looking at developments over the last few years one could argue that the International Space Station, while quite a bit more advanced and costly than the old beater parked in your driveway, is entering a similar phase of its lifecycle. The first modules of the sprawling orbital complex were launched all the way back in 1998, and had a design lifetime of just 15 years. But with no major failures and the Station’s overall condition remaining stable, both NASA and Russia’s Roscosmos space agency have agreed to several mission extensions. The current agreement will see crews living and working aboard the Station until 2030, but as recently as January, NASA and Roscosmos officials were quoted as saying a further extension isn’t out of the question.
Still, there’s no debating that the ISS isn’t in the same shape it was when construction was formally completed in 2011. A perfect case in point: the fact that the rate of air leaking out of the Russian side of the complex has recently doubled is being treated as little more than a minor annoyance, as mission planners know what the problem is and how to minimize the impact it has on Station operations.
While the leak might have been generating some additional buzz over the last week or two, this is only the latest chapter in a story that’s been unfolding for several years.
You can find similar headlines popping up every year or so since at least 2019, and even back that far, it was noted that the Station was constantly losing breathable atmosphere to some degree. It’s only considered a proper “leak” when ground controllers see a notable spike in the normal amount of air being lost.
By 2020, the rate of air being lost was getting to the point that NASA and Roscosmos decided it was worth spending some time to investigate. So during a (relatively) slow operational period, with only three crew members aboard, all of the inter-module hatches were closed throughout the Station. The air pressure in each module was then carefully monitored over the next several days, an effort which ultimately determined the leak was somewhere within the Russian Zvezda module.
Once it was determined the leak was on the Russian side of the complex, cosmonauts started a more localized search. By 2021, they were watching thin strips of paper and tea leaves as they were carried by air currents within Zvezda. This allowed them to identify a few cracks in the hull where air was escaping, which were taped up to help slow the bleed.
Our latest update comes from NASA’s International Space Station Program Manager, Joel Montalbano. In a February 28th media briefing ahead of the launch of the SpaceX Crew-8 mission, Montalbano explained that the Station’s leak rate doubled to approximately 0.9 kilograms of air per day as preparations were made to dock the Progress MS-26 cargo spacecraft to the Station.
It was eventually determined that the leak was within the one-meter-long vestibule at the rear of Zvezda known as the PrK, which acts as a sort of airlock between a visiting Progress spacecraft and the rest of the module. The leak rate only increased when the inner hatch to this chamber was open, and went back to normal as soon as it was closed. Since this hatch only needs to be open during active loading and unloading of a Progress vehicle, NASA feels confident that it presents no risk to the crew or the Station.
Still, Montalbano said the situation is being continuously monitored from the ground, and that Russian engineers are currently working to locate the leak within the PrK and patch it permanently. He also explained that, thanks to the nature of the PrK chamber, even if the leak were to worsen and prove beyond repair, it would pose no risk to the Zvezda module. It could, however, mean permanently losing access to the docking port on the other side of the PrK, an unfortunate but not insurmountable situation.
]]>That’s what an international team of astronomers wants to know. “The first few hundred million years of the Universe was a very active phase, with lots of gas clouds collapsing to form new stars,” said Tobias Looser from the Kavli Institute for Cosmology at the University of Cambridge. “Galaxies need a rich supply of gas to form new stars, and the early universe was like an all-you-can-eat buffet.”
So, when the galaxy JADES-GS-z7-01-QU showed up in a JWST observation, it didn’t exhibit much evidence of ongoing star formation. (JADES stands for JWST Advanced Deep Extragalactic Survey.) It’s in what astronomers refer to as a “quenched” state and looks like star formation started and quickly stopped. Figuring out why this happened to the young galaxy is an important step in cosmology. Why did it stop creating stars? And, were the factors that affect star formation the same then as they are today?
Star-formation quenching is something astronomers don’t expect to happen quickly. “It’s only later in the universe that we start to see galaxies stop forming stars, whether that’s due to a black hole or something else,” said Dr Francesco D’Eugenio, also from the Kavli Institute for Cosmology and a co-author with Looser on a recent paper about JADES-GS-z7-01-QU.
Star birth usually begins as clouds of gas coalesce. Gas-rich regions, including galaxies, are prime spots for stellar nurseries. JWST data show that this baby galaxy experienced a very intense period of star formation shortly after it began forming (after the Epoch of Reionization). For somewhere between 30 and 90 million years, it was ablaze with star formation. Then, suddenly, it stopped.
Although astronomers aren’t sure exactly why it stopped, the galaxy clearly ran out of usable gas. Maybe a supermassive black hole at its heart gobbled up much of the available “star stuff”, or the black hole’s rapidly moving winds and jets shoved a great deal of star-forming material completely out of the galaxy. It’s also possible that the very rapid pace of star formation JADES-GS-z7-01-QU experienced simply used up the supply. That’s not impossible, according to Looser. “Everything seems to happen faster and more dramatically in the early universe, and that might include galaxies moving from a star-forming phase to dormant or quenched,” he said.
It’s not clear from the current JWST data what happened to this little galaxy back at the dawn of time. Astronomers are still probing the data. “We’re not sure if any of those scenarios can explain what we’ve now seen with Webb,” said paper co-author Professor Roberto Maiolino. “Until now, to understand the early Universe, we’ve used models based on the modern universe. But now that we can see so much further back in time, and observe that the star formation was quenched so rapidly in this galaxy, models based on the modern universe may need to be revisited.”
That means more observations using JWST. “We’re looking for other galaxies like this one in the early universe, which will help us place some constraints on how and why galaxies stop forming new stars,” said D’Eugenio. “It could be the case that galaxies in the early universe ‘die’ and then burst back to life – we’ll need more observations to help us figure that out.”
There’s one other possibility that astronomers will want to probe. JADES-GS-z7-01-QU looked dead at the time of its life when JWST observed it. But, it’s possible that the star-birth quenching was only a temporary thing. Maybe it was caused by periodic outflows of star-stuff material to interstellar space (driven by the black hole in the nucleus). Other galaxies have also been observed to be taking a star-birth break, but they’re much more massive than this one.
Perhaps JADES-GS-z7-01-QU started up the star-forming factory later in its history. In that case, it could well have grown much more massive in later epochs of cosmic history. And, this provides an intriguing idea: perhaps other “quenched” galaxies also took a break, then got a massive infusion of gas—perhaps through collisions with other galaxies—to create later generations of stars. Future JWST observations should uncover more of these galaxies and that should allow astronomers to study their quenched phases in more detail.
Astronomers Spot Oldest ‘Dead’ Galaxy Yet Observed
A Recently Quenched Galaxy 700 Million Years After the Big Bang (arXiv preprint)
The post This Galaxy Was Already Dead When the Universe Was Only 700 Million Years Old appeared first on Universe Today.
]]>The time has come. Seven years ago on an August afternoon, the shadow on the Moon swept across the United States. Now we’re in the one month stretch, leading up to the big ticket astronomical event for 2024: the April 8th total solar eclipse spanning North America.
This is the last total solar eclipse for the ‘lower 48 states’ until August 23rd, 2044, although totality does nick the remote northwestern corner of Alaska on March 30th, 2033. The path of totality on April 8th spans Mexico, the contiguous United States from Texas to Maine, and the Canadian Maritimes.
The eclipse will be partial from southeast Alaska all the way down to the very northwestern edge of South America. Hawaii will see a partial eclipse at sunrise; on the other end, Iceland and the far western coast of Ireland will see a partial eclipse underway at sunset.
The first eclipse season of 2024 actually begins on the night of Sunday/Monday March 24/25. A penumbral lunar eclipse that night puts the whole celestial game into play. This subtle eclipse is visible from the Americas. Don’t expect to see much more than a slight ragged darkening on the southwest limb of the Moon around 7:12 Universal Time.
Though it’s a slight affair, this penumbral eclipse means that the nodes where the Moon’s path intersects the ecliptic are aligning for the total solar eclipse two weeks later. While the 2017 event was an ascending node eclipse, the 2024 one is a descending node event, crisscrossing the 2017 path.
This eclipse is member 30 of the 71 eclipses in solar saros series 139. This saros began way back on May 17th, 1501, and produced its first fully total solar eclipse (as opposed to a hybrid annular-total) on December 21st, 1843. It’ll cease doing so with the brief total solar eclipse of March 26th, 2601, and finally end on July 3rd, 2763.
One famous alumnus of saros 139 occurred one exeligmos (three saroses, or 54 years) ago on March 7th, 1970. That eclipse moved right up the U.S. East Coast on a path just slightly east of the upcoming one. The three-saros period is crucial, as each saros shifts the path 120 degrees of longitude westward, and three bring it nearly full circle back around the globe. The 1970 eclipse is one of two suspects referenced in Carly Simon’s song You’re So Vain… and the April 8th eclipse passes over the very tip of northern Nova Scotia. Will someone once again take their “Learjet to Nova Scotia, to see a total eclipse of the Sun?”
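The roughly 120-degree shift is simple arithmetic: one saros is about 6,585.32 days, and that extra third of a day means Earth rotates roughly an additional third of a turn before the next eclipse in the series. A quick back-of-the-envelope sketch in Python (the saros length is a rounded figure, so this is an illustration, not an ephemeris calculation):

```python
# One saros ≈ 6585.32 days. The fractional ~0.32 day means Earth has
# rotated about an extra third of a turn when the next eclipse in the
# series occurs, shifting the ground track west in longitude.
SAROS_DAYS = 6585.32

def path_shift_deg(n_saroses):
    """Approximate westward longitude shift of the eclipse path
    after n saros periods (rounded, back-of-the-envelope)."""
    extra_days = (SAROS_DAYS * n_saroses) % 1  # leftover fraction of a day
    return extra_days * 360                    # degrees of Earth rotation

print(path_shift_deg(1))  # ≈115°: about a third of a turn, often quoted as ~120°
print(path_shift_deg(3))  # ≈346° after an exeligmos: nearly back around the globe
```

The three-saros residue of ~346° (about 14° short of a full circle) is why an exeligmos brings the path back to nearly, but not exactly, the same longitudes, which is how the 1970 and 2024 tracks end up close neighbors.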
To be sure, we enjoy living in an epoch on a planet where total solar eclipses can occur… but this won’t always be the case. The Moon is slowly receding from the Earth, meaning that in about 600 million years time, all solar eclipses will be partial or annular only. Already, in the current 5,000 year epoch, annulars are now more common than totals. We’re also not the only place in the solar system where you could stand and see a moon versus the Sun in a close fit; the surfaces of the Jovian moons witness something similar about twice a decade.
On Monday April 8th, the action begins when the penumbral (partial) shadow of the Moon first touches down over the South Pacific at 15:42 Universal Time (UT). Then, the inner umbral shadow touches down over the south-central Pacific at 16:42 UT, sweeping its way to the northeast. The shadow then first makes landfall over the Pacific coast of Mexico at 18:09 UT, and reaches its maximum duration of 4 minutes and 28 seconds over northern Mexico just shy of the Texas border.
This eclipse is on the long side of medium: its maximum totality of 4 minutes and 28 seconds falls about three minutes shy of the theoretical maximum of 7 minutes and 32 seconds.
The 198-kilometer-wide shadow then continues to sweep northeast at 2,517 kilometers per hour, intersecting the path of the 2017 eclipse over the states of Missouri, Illinois and Kentucky around 19:00 UT. Continuing its trek, the shadow then ranges over Lake Erie, northern New England and the Canadian Maritime provinces until it departs the Earth over the North Atlantic at 19:55 UT. The final partial phases of the eclipse wrap up at 20:52 UT.
Millions live along the path of totality or within an easy day’s drive of it. Major cities, including Dallas-Fort Worth, Indianapolis and Buffalo, are all in the eclipse path. It’s well worth making the trip to the path to witness a total solar eclipse: even a deep 99% partial eclipse (or an annular eclipse) leaves the sky surprisingly bright, and you might hardly notice it otherwise.
“We urge anyone who can to go inside the path of total solar eclipse on April 8,” Michael Zeiler (Great American Eclipse) told Universe Today. “It will be an amazing experience when the sunlight suddenly disappears and the Sun’s stunning corona shimmers in the darkened sky. A total solar eclipse is nature’s most beautiful sight and you will never regret the effort to go see totality. If hotels are booked, stay with a friend or relative or go camping.”
“If someone is in a location of 95% partial solar eclipse and says they will see most of the interesting phenomena, sorry, but they’re wrong,” says Zeiler. “You have to be inside the path of totality with clear skies to see the full glory of totality. It’s the difference between watching the World Series final game in person or staying in a car in the stadium parking lot listening to the radio.”
Proper safety precautions must be adhered to during all partial phases of the eclipse. This means covering finderscopes, and either projecting the eclipsed Sun or using eclipse glasses made for solar viewing. Approved glasses are stamped ISO 12312-2:2015 on the arms. Check those 2017 eclipse glasses in daylight for cracks or pinholes before using them on eclipse day. NASA has a good page on eclipse safety, and tips on building a pinhole projector.
We should know just what the weather might do about a week out from eclipse day. Likewise, we should start to have an idea of just how photogenic the partially eclipsed Sun will be in terms of sunspots, with a peek at what’s starting to rotate into view around April 1st. We’re nearing maximum for Solar Cycle 25, so we could be in for a fairly active Sun.
Best bets for clear skies are Texas and Mexico, though April cloud cover can be fickle along the entire track. Keep in mind, you don’t need a crystal-clear sky to see the eclipse; just a good view of the Sun. We had memorable views of the partially eclipsed Sun in 2017 leading up to totality, filtered through an approaching cloud bank.
Mobility and road access are key on eclipse day; your range of options for where to head dwindles in the hours prior. NOAA’s GOES-East is a great resource for seeing how the cloud cover situation is developing come eclipse day. Don’t despair if clouds thwart the view: nearly every eclipse chaser has at least one story of the one that got away, and plans made to head to the next.
As the partial phases deepen, watch for crescent Suns dappling the ground. These are cast through natural pinhole projectors such as gaps in tree leaves and lattice-work. Spaghetti strainers or cheese graters are great tools for replicating this effect, and projecting the Sun onto a high-contrast surface such as a piece of white paper can really enhance the view.
If it’s your first time experiencing totality, I’d advise you to simply enjoy the experience. The scant few minutes of totality go by quickly, and most people are surprised by the abrupt transition from broad daylight to an eerie, otherworldly twilight. You can drop the glasses as totality begins, and note the glow that circles the horizon. Jupiter and Venus will be visible near the eclipsed Sun. Also watch for the +1st magnitude stars Aldebaran and Betelgeuse and -1st magnitude Sirius above the general horizon. Imagers may be treated to views of Comet 12P/Pons-Brooks, just two weeks from perihelion.
Fun fact: comets have been discovered during eclipses, as occurred on November 1st, 1948.
Totality is the only time you’ll see the corona, the ethereal outermost atmosphere of the Sun. The streamers of the corona can look different from one eclipse to the next. Seasoned eclipse chasers can actually tell which eclipse a given image is from, based on the appearance of the corona.
Temperatures may drop, and nocturnal wildlife may be briefly fooled by the onset of a false dusk. In 2017, we suddenly faced an onslaught of mosquitoes as totality fell over the Smoky Mountains of North Carolina.
As totality deepens, ask yourself: what would you think, centuries or millennia ago, if you were going about your daily business and such an event occurred, without warning?
These days, it is possible to nab a quick photo during totality with a smartphone camera. Be sure to shoot in RAW/Pro mode, and have your settings at the ready. Totality comes and goes very quickly. Here’s a great link to shooting an eclipse with your smartphone, and DSLR settings for totality. Check out this amazing smartphone eclipse video, courtesy of Tom Kerss:
— Tom Kerss FRAS (@tomkerss) March 6, 2024
The reappearance of the ’diamond ring’ effect, as sunlight streams down the valleys along the lunar limb, signals that it’s time to put the eclipse glasses back on. Folks along the edge of the path may witness a string of similar flashes known as Baily’s Beads. Key sites may also see the elusive ‘double diamond ring’ effect.
Bitten by the ‘eclipse bug’? The next total solar eclipse isn’t until August 12th, 2026, across Greenland, Iceland, and northern Spain. Incidentally, Spain becomes totality central after 2024: two more eclipses grace the Iberian peninsula, a total on August 2nd, 2027 and an annular on January 26th, 2028.
Lots of amateur and professional projects are also underway leading up to the eclipse. We also typically see amazing views of the eclipse from space. These include views from ESA’s Proba-2 mission, NOAA’s GOES satellites, and from the International Space Station.
Also, expect NASA to livestream the event, come eclipse day.
And me? In an act of astronomical hubris, I’m once again tempting clouds and heading to northern Maine come eclipse day. This one has a special significance for us. It’s the only time that totality graces my hometown of Mapleton, Maine for this century. My rationale is, if we’re clouded out, we’ll then have an argument to chase after the next one…
Good luck, good eclipse chasing to all that live in or are headed to the path of totality, and clear skies!
The post Into Totality: Our Complete Guide to the April 8th Total Solar Eclipse Across North America appeared first on Universe Today.
]]>An atomic clock is a type of clock that uses the vibrations of atoms to measure time. The Royal Institution shared this short video on YouTube!
]]>Why do we need to coordinate the way we measure time? And how do atomic clocks work? Find out with Leon Lobo, Head of the National Timing Centre (NTC) programme at the National Physical Laboratory.
This video is part of our celebration of British Science Week 2024, which this year has the theme of ‘time’. Find out more at https://www.britishscienceweek.org/
Here are some things we know to be true: Gravity bends spacetime. Light must curve along those bends. Curved light can create lensing. Lensing leads to magnification. What does all that mean? Gravitational lensing can be used to create telescopes more powerful than anything that has ever been deployed before. A gravitational lens telescope using the impact of the sun on spacetime could be powerful enough to map the surfaces of exoplanets. Here’s a video from Launch Pad Astronomy and an article from Popular Science on the creation of a solar gravity lens:
…Slava Turyshev, a scientist at NASA’s Jet Propulsion Lab, is trying to harness one of these gravitational lenses closer to home, using our sun. In a new paper posted to the pre-print server arXiv, Turyshev computes all the detailed math and physics needed to show that it is actually possible to harness our sun’s gravity in this way, with some pretty neat uses. A so-called “solar gravitational lens” (SGL) could help us beam light messages into the stars for interstellar communication or investigate the surfaces of distant exoplanets.
“By harnessing the gravitational lensing effect of our star, astronomy would experience a revolutionary leap in observing capability,” says Nick Tusay, a Penn State astronomer not involved in the new work. “Light works both ways, so it could also boost our transmitting capability as well, if we had anyone out there to communicate with.”
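To see why the lens must be "harnessed" so far from home, here is a worked example using the standard general-relativity deflection formula (this calculation is an illustration, not taken from Turyshev's paper): light grazing the solar limb is bent by an angle of 4GM/(c²R), so the rays converge at a minimum focal distance of R²c²/(4GM), which is where an SGL spacecraft would have to operate.

```python
import math

# Standard GR result: light grazing the Sun at radius R is deflected by
# alpha = 4GM/(c^2 R), so grazing rays converge at a focal distance
# f = R^2 c^2 / (4GM) -- the closest point where a solar gravitational
# lens telescope could be stationed.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
R_SUN = 6.957e8      # solar radius, m
AU = 1.496e11        # astronomical unit, m

focal_m = (R_SUN**2 * C**2) / (4 * G * M_SUN)
print(focal_m / AU)  # roughly 550 AU: far beyond Pluto, but reachable
```

The result, around 550 astronomical units, explains why SGL proposals all involve a dedicated deep-space mission rather than an Earth-orbiting telescope.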
The search for biosignatures on potentially habitable exoplanets is heating up. The JWST has successfully gathered some atmospheric spectra from exoplanet atmospheres, but it has a lot of other jobs to do and observing time is in high demand. A planned space telescope named LIFE is dedicated to finding exoplanet biosignatures, and recently, researchers gave it a test: can it detect Earth’s biosignatures?
As an interferometer, LIFE is made up of five separate telescopes that will work in unison to expand the telescope’s working size. LIFE is being developed by ETH Zurich (Federal Institute of Technology Zurich) in Switzerland. LIFE will observe in mid-infrared, where the spectral lines from the important bioindicative chemicals ozone, methane, and nitrous oxide can be found.
LIFE will be located at Lagrange Point 2, about 1.5 million km (1 million miles) away, where the JWST is also located. From that location, it’ll observe a list of exoplanet targets in hopes of finding biosignatures. “Our goal is to detect chemical compounds in the light spectrum that hint at life on the exoplanets,” explained Sascha Quanz, Professor for Exoplanets and Habitability at ETH Zurich, who is leading the LIFE initiative.
LIFE is still only a concept, and researchers wanted to test its performance. Since it hasn’t been built yet, a team of researchers used Earth’s atmosphere as a test case. They treated Earth as if it were an exoplanet and tested LIFE’s methods against Earth’s known atmospheric spectrum in different conditions. They used a tool called LIFEsim to work with the data. Researchers often use simulated data to test mission capabilities, but in this case, they used real data.
Their results are published in The Astronomical Journal. The research is titled “Large Interferometer For Exoplanets (LIFE). XII. The Detectability of Capstone Biosignatures in the Mid-infrared—Sniffing Exoplanetary Laughing Gas and Methylated Halogens.” The lead author is Dr. Daniel Angerhausen, an Astrophysicist and Astrobiologist at ETH in Zürich.
In a real-world scenario, Earth would be just a distant, nearly indiscernible speck. All LIFE would see is the planet’s atmospheric spectrum, which would change over time depending on what views the telescope captured and, critically, on how long it observed.
These spectra would be gathered over time, and that leads to an important question: how would the observational geometry and seasonal variations affect LIFE’s observations?
Fortunately for the research team, we have ample observations of Earth for them to work with. The researchers worked with three different observational geometries: two views from the poles and one from the equatorial region. From those three viewpoints, they worked with atmospheric data from January and July, which accounts for the largest seasonal variations.
Though planetary atmospheres can be extremely complex, astrobiologists focus on certain aspects to reveal a planet’s potential to host life. Of particular interest are the chemicals N2O, CH3Cl, and CH3Br (nitrous oxide, chloromethane, and bromomethane), all of which can be produced biogenically. “We use a set of scenarios derived from chemical kinetics models that simulate the atmospheric response of varied levels of biogenic production of N2O, CH3Cl, and CH3Br in O2-rich terrestrial planet atmospheres to produce forward models for our LIFEsim observation simulator software,” the authors write.
In particular, the researchers wanted to know if LIFE will be able to detect CO2, water, ozone and methane on planet Earth from about 30 light years away. These are signs of a temperate, life-supporting world—especially ozone and methane, which are produced by life on Earth—so if LIFE can detect biological chemistry on Earth in this way, it can detect it on other worlds.
LIFE was able to detect CO2, water, ozone and methane on Earth. It also detected some surface conditions that indicate liquid water. Intriguingly, LIFE’s results didn’t depend on which angle Earth is viewed from. This is important since we don’t know what angles LIFE will be observing exoplanets from.
Seasonal fluctuations are the other issue, and they weren’t as easy to observe. But fortunately, it looks like that won’t be a limiting factor. “Even if atmospheric seasonality is not easily observed, our study demonstrates that next-generation space missions can assess whether nearby temperate terrestrial exoplanets are habitable or even inhabited,” said Quanz.
However, detecting the desired chemicals isn’t enough. The critical piece is how long it takes. Building a space interferometer that detected these chemicals but took too much time to do it wouldn’t be practical or effective. “We use the results to derive observation times needed for the detection of these scenarios and apply them to define science requirements for the mission,” the research team writes in their paper.
To paint a larger picture of LIFE’s observing times, the researchers developed a list of targets. They created a “… distance distribution of HZ planets with radii between 0.5 and 1.5 Earth radii around M and FGK-type stars within 20 pc of the Sun that are detectable with LIFE.” The data for these targets comes from NASA and from other previous research.
The results show that only a few days are needed for some targets, while for others, it could take up to 100 days to detect relevant abundances.
What the team calls “golden targets” are the easiest to observe. Planets in the Proxima Centauri system are an example of these types of targets. Only a few days of observation are needed for these planets. It’ll take about ten days of observations with LIFE to observe “certain standard scenarios such as temperate, terrestrial planets around M star hosts at five pc,” the researchers write. The most challenging cases that are still feasible are exoplanets that are Earth twins about 5 parsecs away. According to the results, LIFE needs between about 50 and 100 days of observing to detect the biosignatures.
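As a rough illustration of why distance matters so much, here is a hedged back-of-envelope sketch (not from the paper): if a detection is background-noise limited, SNR grows as flux times the square root of integration time, and flux falls as 1/d², so the required time scales roughly as d⁴. Real requirements depend strongly on the host star and the target spectrum, so nearby bright targets will not follow this toy scaling exactly; the 75-day reference value below is simply the midpoint of the 50 to 100 days quoted for an Earth twin at 5 pc.

```python
# Toy scaling, assuming a background-limited detection:
# SNR ~ F * sqrt(t) with F ~ 1/d^2 implies t ~ d^4.

def observing_time_days(distance_pc, t_ref_days=75.0, d_ref_pc=5.0):
    """Required observing time, scaled as t ∝ d^4 from a reference case."""
    return t_ref_days * (distance_pc / d_ref_pc) ** 4

for d in (5.0, 7.0, 10.0):  # reference case and two farther targets
    print(f"{d:5.1f} pc -> {observing_time_days(d):8.1f} days")
```

Even this crude model makes the point: doubling the distance multiplies the needed observing time by sixteen, which is why LIFE's target list is confined to the solar neighbourhood.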
LIFE is still just a potential mission at this point. It’s not the first proposed mission that would be solely focused on exoplanet habitability. In 2023, NASA proposed the Habitable Worlds Observatory (HWO). Its goal is to directly image at least 25 potentially habitable worlds and then search for biosignatures in their atmospheres.
But, according to the authors, their results show that LIFE is the best option.
“If there are late-type star exoplanetary systems in the solar neighbourhood with planets that exhibit global biospheres producing N2O and CH3X signals, LIFE will be the best-suited future mission to systematically search for and eventually detect them,” they conclude.
The post The LIFE Telescope Passed its First Test: It Detected Biosignatures on Earth. appeared first on Universe Today.
In the past few decades, particle colliders have become a key tool for unraveling the mysteries of the universe at the fundamental level. The Large Hadron Collider (LHC) was a game changer: with its 27 km circumference, it became the world’s most powerful collider. There are now plans to increase the number of collisions to improve its contribution to our understanding of the Universe, but even with this ‘High Luminosity’ phase, CERN (the European Organization for Nuclear Research) wants to go even further and build a new collider!
If colliders like the LHC are to play a part in high energy physics over the coming years, then energy thresholds need to be pushed beyond current capabilities. The Future Circular Collider (FCC) study has looked into various collider designs, envisaging a research infrastructure housed within a 100 km underground tunnel. This ambitious project promises a physics program that will take high energy research into the next century.
There are a number of challenges facing the design and engineering of the new tunnel, however: it must steer clear of geologically problematic areas, optimise future collider efficiency, allow for connectivity with the LHC, and minimise the social and environmental impacts of the surface buildings and infrastructure. Choosing ‘where to put it’ seems to be quite the challenge, so a range of layout options are being considered, guided by CERN’s intent to limit the impact on the surrounding area.
The FCC tunnel (which looks like it will be a ring-shaped underground passage beneath Haute-Savoie and Ain in France and Geneva in Switzerland) will house two colliders operating sequentially. The first phase, scheduled for inauguration around the mid-2040s, comprises an electron-positron collider (FCC-ee). The hope is that it will give unparalleled precision measurements and unveil physics beyond the Standard Model. Following hot on its heels will be the proton-proton collider (FCC-hh), which will surpass the energy capability of the LHC roughly eightfold!
It’s an exciting prospect that the FCC will push particle collisions to energies of 100 TeV in the hope of uncovering new realms of physics. To achieve that goal, however, new technological advances will be required, and to that end, over 150 universities from around the world are exploring the options.
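As a rough illustration of why a bigger ring (and stronger magnets) buys more energy, the maximum proton beam energy in a circular collider scales with the dipole field and the magnetic bending radius. This is a hedged back-of-envelope sketch using the standard E[GeV] ≈ 0.3·B[T]·r[m] relation; the bending radii below are approximations for illustration, not CERN's figures, since only part of a ring is filled with dipole magnets.

```python
# Textbook relation for a relativistic proton in a storage ring:
# E[GeV] ~ 0.3 * B[T] * r[m], where r is the magnetic bending radius.

def beam_energy_tev(b_field_tesla, bending_radius_m):
    return 0.3 * b_field_tesla * bending_radius_m / 1000.0  # GeV -> TeV

# LHC: ~8.3 T dipoles, ~2800 m bending radius -> ~7 TeV/beam (14 TeV collisions)
print(beam_energy_tev(8.3, 2800))
# FCC-hh: ~16 T dipoles, ~10400 m bending radius -> ~50 TeV/beam (100 TeV collisions)
print(beam_energy_tev(16.0, 10400))
```

The arithmetic shows why both a 100 km tunnel and next-generation 16 T magnets are needed: neither alone gets FCC-hh to the 100 TeV collision energy.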
Source : Feasibility Study into new Super Collider
The post CERN Wants to Build an Enormous New Atom Smasher: the Future Circular Collider appeared first on Universe Today.
Today, ESA’s space telescope Euclid begins its survey of the dark Universe. Over the next six years, Euclid will observe billions of galaxies across 10 billion years of cosmic history. Learn how the team prepared Euclid in the months after launch for this gigantic cosmic quest.
Now, astronomers have discovered that some free-floating planets (FFPs) can orbit each other in binary relationships, as if swapping their star for another rogue planet.
In 2023, astronomers working with the James Webb Space Telescope (JWST) detected 42 JuMBOs in the inner Orion Nebula and the Trapezium Cluster. JuMBOs are different from other free-floating planets: they’re Jupiter-Mass Binary Objects.
In that research, the JWST performed a near-infrared survey of the region with its powerful NIRCam. It looked at powerful outflows and jets from young stars, ionized circumstellar disks, and other objects in the region. Among the findings were the 42 JuMBOs. “Further papers will examine those discoveries and others in more detail,” the authors of that paper wrote.
That’s exactly what’s happened. New research published in The Astrophysical Journal Letters examines one of the JuMBOs in more detail. But instead of infrared observations, the authors used observations from the Karl G. Jansky Very Large Array (VLA) to examine the objects in radio emissions.
The paper is “A Radio Counterpart to a Jupiter-mass Binary Object in Orion.” The lead author is Luis Rodriguez, a researcher at the Instituto de Radioastronomía y Astrofísica, Universidad Nacional Autónoma de México.
“The existence of these wide free-floating planetary-mass binaries was unexpected in our current theories of star and planet formation,” Rodriguez and his colleagues write in their paper. “These systems are not associated with stars, and their components have masses of giant Jupiter-like planets and separations in the plane of the sky of order about 100 au.”
Our understanding of planets and how they form starts with stars. Stars form in giant molecular clouds, and as they form, a rotating disk of gas and dust forms around the star. Planets form in these disks, and they take up residence in orbit around the star.
But rogue planets, also called Isolated Planetary Mass Objects (IPMOs), can form differently. Currently, there are two competing explanations for their formation. They may form around stars as described above, or they may form in isolation like low-mass stars and brown dwarfs do.
The JuMBOs range from 0.6–14 Jupiter masses, and they’re between 28 and 384 AU apart. There’s currently no explanation for how these binary objects can form. Solitary rogue planets are compatible with our understanding of how stars and planetary systems form. But JuMBOs don’t fit inside that understanding.
These objects have things in common with brown dwarfs, sub-stellar objects more massive than the largest planets yet too small to trigger fusion. Brown dwarfs can be found at wide separations in binary pairs. Astronomers found one brown dwarf pair separated by 240 AU, and there are likely more widely separated brown dwarf binaries yet to be discovered.
In this paper, the researchers examined one particular JuMBO from the previous study called JuMBO 24. They looked at VLA observations that spanned a decade and found that JuMBO 24 was far brighter in radio luminosity than brown dwarfs.
The research team naturally wondered if the radio sources they detected were coming from JuMBO 24. By working their way through the data, they concluded that it’s highly unlikely that the radio emissions are coming from a source other than JuMBO 24. The odds of the radio emissions and the infrared emissions detected by JWST coming from separate sources are only about 1 in 10,000, according to the researchers.
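The kind of chance-alignment estimate described above can be sketched with a standard formula: the probability that an unrelated background source lands within a match radius r of a given position, given a sky density n of such sources, is roughly 1 − exp(−nπr²). The density and radius below are made-up illustrative numbers, not values from the paper.

```python
import math

# Chance-coincidence probability for a positional cross-match:
# P ~ 1 - exp(-n * pi * r^2), with n the surface density of comparable
# background sources and r the match radius.

def chance_coincidence(n_per_sq_arcsec, match_radius_arcsec):
    return 1.0 - math.exp(-n_per_sq_arcsec * math.pi * match_radius_arcsec**2)

# e.g. one comparably bright radio source per ~3000 sq. arcsec and a
# 0.3" match radius (both hypothetical numbers for illustration)
print(chance_coincidence(1 / 3000, 0.3))  # of order 1e-4, i.e. ~1 in 10,000
```

With plausible inputs, the probability comes out at the ~1-in-10,000 level quoted, which is why a positional match this tight is treated as a secure identification.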
The objects most similar to JuMBOs in terms of their radio emissions are the ultracool dwarfs. But JuMBO 24 doesn’t exhibit the same patterns in radio emissions that ultracool dwarfs do. “The radio emission appears to be steady at a level of about 50 microjanskys over timescales of days and years,” the authors point out, while emissions from ultracool dwarfs have greater variability. That makes JuMBO 24 the first planetary-mass binary object detected as a centimetre continuum source.
“The radio emission is marginally resolved in the same direction as the infrared source detected by JWST, suggesting that the radio emission comes from a combination of the two planetary-mass objects,” the researchers write in their conclusion. For now, the mechanism responsible for the radio emissions is a mystery. “Additional radio observations are necessary to pin down the nature of the radio emission mechanism,” the team concludes.
For lead author Rodriguez, there’s more to this discovery than just the unexplained radio emissions and what the discovery means for our understanding of how planets can form. These binary planets on ultra-wide orbits around each other could host moons that are potential abodes of life.
“What’s truly remarkable is that these objects could have moons similar to Europa or Enceladus, both of which have underground oceans of liquid water that could support life,” he stated.
The post Radio Telescope Confirms Free-Floating Binary Planets in the Orion Nebula appeared first on Universe Today.
Detecting biosignatures in the atmospheres of distant planets is fraught with difficulties. They don’t advertise their presence, and the signals we receive from exoplanet atmospheres are complicated. New research adds another complication to the effort. It says that lightning can mask the presence of things like ozone, an indication that complex life could exist on a planet. It can also amplify the presence of compounds like methane, which is considered to be a promising biosignature.
The new research is “The effect of lightning on the atmospheric chemistry of exoplanets and potential biosignatures,” and it’s been accepted for publication in the journal Astronomy and Astrophysics. The lead author is Patrick Barth, a researcher from the Space Research Institute at the Austrian Academy of Sciences.
While we’ve discovered over 5,500 exoplanets, only 69 of them are in the potentially habitable zones around their stars. They’re rocky planets that receive enough energy from their stars to potentially maintain liquid water on their surfaces. Our search for biosignatures is focused on this small number of planets.
The important next step is to determine if these planets have atmospheres and then what the composition of those atmospheres is. The JWST is our most powerful instrument for these purposes. But in order to understand what the JWST shows us in distant atmospheres, we have to know what its signals tell us. Research like this helps scientists prepare for the JWST’s observations by alerting them to potential false positives and masked biosignatures.
In their research, the authors combined laboratory experiments with photochemical and radiative transfer modelling. Atmospheres can be extraordinarily complex, and no two exoplanets are likely to have the same atmospheric qualities. But physics and chemistry dictate what can happen, and photochemical and radiative transfer models can handle thousands of different types of chemical reactions in atmospheres.
In the laboratory experiments, spark discharge stood in for lightning. The researchers focused on atmospheres containing N2, CO2, and H2 and the different products the lightning produced. Other research has done the same, but this work is different. Previous research focused on either individual products or only a small number of products. But Barth and his colleagues expanded on that work. They studied the production of a wider variety of chemicals.
That allowed them to “… investigate trends in our experiments concerning the oxidation state of lightning products and the influence of water vapour,” they explain. “In particular, we were interested in the effect of lightning on the production of potential (anti-)biosignatures in the context of current and upcoming observations of exoplanetary atmospheres.”
The researchers found that the effect of lightning on biosignatures depends on the type of atmosphere and the amount of lightning. They looked at two broad types of atmospheres: reducing and oxidizing. A reducing atmosphere has no oxygen or other oxidizing gases and can’t produce any oxidized compounds. An oxidizing atmosphere is the opposite. It does contain oxygen, which produces oxidized compounds.
Their results show that for a planet with surface water and habitable conditions with a slightly reducing or slightly oxidizing atmosphere, lightning is less likely to produce false positives. The authors predict that “… for the kind of atmospheres studied here, lightning is not able to produce a false-positive NH3 or CH4 biosignature.” They say that it’s also unlikely that lightning could produce a false positive N2O biosignature.
But the lightning produced some compounds, including CO and NO. The researchers used the rates of production of both chemicals to calculate how lightning flash rates affect the atmosphere’s chemical makeup. Next, they applied that model to Earth-sized planets in the habitable zones of the Sun and TRAPPIST-1 for both oxic and anoxic atmospheres. They conducted simulations of those scenarios on planets with and without biospheres. They also calculated the simulated spectra from those worlds to identify chemical signatures.
Their results?
“We find that lightning is not able to produce a false-positive CO anti-biosignature on an inhabited planet,” the authors explain. “In an oxygen-rich atmosphere, however, lightning rates only a few times higher than modern Earth’s can mask the O3 (ozone) biosignature.”
But in other situations, lightning can prevent false positives. In an anoxic atmosphere of a planet orbiting an old red dwarf, lightning more frequent than Earth’s can remove one type of confounding false positive.
“Similarly, in an anoxic, abiotic atmosphere of a planet orbiting a late M dwarf, lightning at flash rates ten times or more than that of modern Earth can remove the abiotic ozone feature produced by CO2 photolysis, preventing a false-positive biosignature detection,” they explain. To say it’s complicated is an understatement.
There’s yet another twist. Lightning may not prevent other important false positives. “… lightning might not be able to prevent all false-positive O2 scenarios for CO2-rich terrestrial planets orbiting ultracool M dwarfs,” the authors write.
Those with an eye for irony might notice some here. Scientists are pretty sure that lightning played a role in life on Earth by providing the energetic spark that got the ball rolling. But the fact that lightning could also make it more difficult for us to discover life is somewhat ironic.
But irony is a human contrivance. Nature doesn’t care. It does what it does, and it’s up to us to figure it out.
“In summary, our work provides new constraints for the full characterization of atmospheric and surface processes on exoplanets,” the authors conclude.
The post If Exoplanets Have Lightning, it’ll Complicate the Search for Life appeared first on Universe Today.
If you think the three body problem is a headache, just wait till you try to wrap your math around a collision between four galaxies that has produced three active galactic nuclei. Say hello to Arp–Madore 2339-661. Here’s more from Astronomy Now:
All four galaxies, which are 500 million light years away, are interacting and will eventually merge to form one giant elliptical galaxy. In 2021 a study published in the journal Astronomy and Astrophysics, using data from the Very Large Telescope’s MUSE (Multi Unit Spectroscopic Explorer) instrument and near-infrared observations from the South African Astronomical Observatory, indicated that NGC 7733b contains an active galactic nucleus (AGN) called a Low Ionisation Nuclear Emission-line Region (LINER), while both NGC 7734 and NGC 7733N contain Seyfert AGN. That makes this system, collectively known as Arp–Madore 2339-661, a rare triple AGN. When the galaxies all merge, the supermassive black holes at the heart of these AGN will also merge.
The first radio transmissions were made in 1895, and since then the signals, however weak, have been leaking out into space. The first intentional transmission into space was the Arecibo message of 1974, sent toward the globular cluster M13, 22,180 light years away. That means the signal won’t arrive there for about another 22,131 years! All the while, our signals keep leaking outward, but the further they travel, the weaker they get. It’s likely, then, that any of our signals that have travelled more than about 100 light years are already too weak to be detectable.
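The fading of leaked signals follows the inverse-square law: the power flux received at a distance d falls as 1/d². Here is a hedged toy calculation (the transmitter power is an arbitrary assumption, chosen only to show the scaling):

```python
import math

# Inverse-square law: flux = EIRP / (4 * pi * d^2).
LY_IN_M = 9.461e15  # metres in one light-year

def flux_w_per_m2(eirp_watts, distance_ly):
    d = distance_ly * LY_IN_M
    return eirp_watts / (4 * math.pi * d**2)

# A hypothetical ~1 MW effective transmitter, heard at 1 ly vs 100 ly:
print(flux_w_per_m2(1e6, 1))    # stronger by a factor of 10,000...
print(flux_w_per_m2(1e6, 100))  # ...than the same signal at 100 light-years
```

Going from 1 to 100 light-years costs a factor of 10,000 in received flux, which is why terrestrial leakage becomes practically undetectable at interstellar distances.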
It would be so easy to be dragged into other areas of debate about aliens, but it feels useful to set the scene for how difficult it will be to make contact, or rather, how likely it may be. Assume, then, that we do somehow find ourselves communicating with an alien civilisation. Just how that conversation might go has been modelled by a team led by Mingyu Jin from Northwestern University.
The team used a new artificial intelligence framework known as CosmoAgent, built on large language models (LLMs), to simulate the interaction. The system uses a multi-agent architecture to model interactions among a diverse range of civilisations. Each civilisation can choose its own character traits: hiding, fighting, or collaborating. This dynamic environment allows for a plethora of outcomes, from alliances forming and rules being followed, to rivalries, to how a civilisation might respond to an unforeseen event.
Diversity and the conditions for life were also built into the modelling, using transition matrices to analyse how civilisations might grow and change over time. The natural progression of an intelligent life form would inevitably mean its ethics, morals, beliefs, and sciences develop along varied paths, and these different frameworks would hugely affect just how such a civilisation might respond to alien contact.
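A minimal, hypothetical sketch of the transition-matrix idea (my own illustration, not CosmoAgent's actual code) might look like this: each civilisation holds a strategy and drifts between strategies each epoch according to a Markov transition matrix.

```python
import random

# Each civilisation picks among three strategies; TRANSITIONS[s] gives the
# probability of moving from strategy s to each strategy in STRATEGIES.
STRATEGIES = ["hide", "fight", "cooperate"]
TRANSITIONS = {
    "hide":      [0.70, 0.10, 0.20],
    "fight":     [0.20, 0.60, 0.20],
    "cooperate": [0.15, 0.15, 0.70],
}

def step(strategy, rng):
    """Advance one civilisation by one epoch via the transition matrix."""
    return rng.choices(STRATEGIES, weights=TRANSITIONS[strategy])[0]

rng = random.Random(42)  # seeded so the toy run is reproducible
civs = ["hide", "fight", "cooperate", "hide"]
for epoch in range(5):
    civs = [step(s, rng) for s in civs]
    print(epoch, civs)
```

In the real framework the "transitions" are shaped by LLM agents and each civilisation's stated worldview rather than fixed probabilities; the sketch only shows the Markov-chain skeleton underneath.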
There are limitations to the research, though, largely from the Earth-centric bias inherent in building the language model. The use of mathematics and algorithms to compute responses and outcomes may not cover the full spectrum of inter-civilisation responses; after all, we cannot even distill our own emotional responses down to a set of algorithms, let alone those of a speculative alien civilisation of which we have no evidence or experience to draw upon.
It is hoped that future research can address these obstacles and develop better models of inter-civilisational interaction, taking into account a broader range of ethical paradigms and decision-making processes to provide a more realistic simulation of just how first contact might play out.
Source : What if LLMs Have Different World Views: Simulating Alien Civilizations with LLM-based Agents
The post An AI Simulated Interactions Between Different Kinds of Advanced Civilizations appeared first on Universe Today.
The Harbin Institute of Technology and the China Aerospace Science and Technology Corporation developed the simulator as part of China’s first large scale scientific facility. Its official name is the Space Environment Simulation and Research Infrastructure facility, or SESRI for short, and it will provide a focused way to explore the environments of space, with emphasis on spacecraft, life forms, and plasma (charged gas) interactions.
The facility covers an area the size of 50 soccer fields, has four main laboratories, and can tailor the environmental conditions to research requirements. Each laboratory covers a different aspect of space exploration; for example, the Lunar Dust Simulation chamber studies the impact of dust on spacecraft, astronauts, and their spacesuits. Any spacefaring person or craft is subjected to extreme temperature fluctuations, elevated levels of charged particles and electromagnetic radiation, and higher levels of space dust, and all of these are adjustable in the simulator.
Some experiments that previously required time in space will no longer have to be launched and can be completed on the ground in a far more controlled, safer, and even cheaper environment. Deputy Commander-in-Chief of the project Li Liyi even mused that it was akin to bringing the space station to Earth. In addition to simulating the environment to test spacecraft, it will also allow for agricultural breeding and life science experiments to explore humans’ reactions to long-term colonies on other planets.
The official opening came after 18 years of work from start to finish and hopes to establish China as one of the world’s main aerospace powers. It has already received interest from as many as 110 universities and institutes from over 30 countries.
SESRI holds great importance for China in facilitating scientific and technological breakthroughs that can span technologies, sciences, and even industries. But China’s aspirations don’t stop there. They hope it will help to unravel some of the mysteries of the universe and reveal the scientific laws that govern the cosmos we see today.
Source : Nation opens first simulated environment for space research
The post China Has Built a Huge Space Simulation Chamber appeared first on Universe Today.
In this post I will examine some recordings of the S-band telemetry signal done by AMSAT-DL with the 20 metre antenna in Bochum observatory. These recordings were made while the lander was still in orbit. Once landed on the Moon, IM-1 used the same configuration, but the recordings made at Bochum are probably too weak to decode, due to the orientation of the lander antennas.
I will look at two recordings in this post:
2024-02-15_17-11-34_288000SPS_2210570000Hz.s16
2024-02-22_20-10-59_144000SPS_2210600000Hz.s16
The spacecraft is using different configurations in the two recordings. When landed, IM-1 used the same configuration as in the recording from February 22.
According to FCC paperwork, IM-1 has two Thales Alenia transceivers with 5 dBic low-gain antennas and 8 W output power, and two Quasonix transmitters with a 15 dBic high-gain antenna and 25 W output power for high-speed data. These recordings are likely from the Thales radios, although IM-1 also used the Quasonix transmitter with the low-gain antennas for low-speed data at the end of the mission to improve its link budget.
In the first recording, the modulation and coding is residual-carrier PCM/PSK/PM with a baudrate of approximately 2511.5 baud and a subcarrier frequency of 12 times the baudrate, which gives 30.138 kHz. In the second recording, the modulation is suppressed-carrier BPSK, also with a baudrate of approximately 2511.5 baud. The coding is the same in both configurations: CCSDS k=7, r=1/2 convolutional coding, with 153-byte AOS frames.
Here is the flowgraph that I’ve used to decode the signal in the first recording. It is a typical demodulator for PCM/PSK/PM, followed by Viterbi decoders and Sync and create PDU blocks to find the ASM and extract the packets. Two Viterbi decoders are used in parallel to test both possible pairings of BPSK symbols, and each of these feeds two Sync and create PDU blocks that test the two possible 180 degree phase ambiguities.
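The phase-ambiguity test can be illustrated with a small sketch (a simplified stand-in, not the actual GNU Radio blocks): after BPSK demodulation the decoded bit stream may come out inverted, so the frame synchronizer searches for the 32-bit CCSDS ASM (0x1ACFFC1D) both as-is and complemented.

```python
# Search a decoded bit stream for the CCSDS Attached Sync Marker, testing
# both polarities to resolve the 180-degree BPSK phase ambiguity.
ASM = 0x1ACFFC1D

def to_bits(value, width):
    """MSB-first list of bits for an integer."""
    return [(value >> (width - 1 - i)) & 1 for i in range(width)]

def find_asm(bits):
    """Return (offset, inverted?) of the first ASM match, or None."""
    asm = to_bits(ASM, 32)
    inv = [b ^ 1 for b in asm]
    for i in range(len(bits) - 31):
        window = bits[i:i + 32]
        if window == asm:
            return i, False
        if window == inv:
            return i, True
    return None

# Toy stream: a few noise bits, then an inverted ASM, then filler
stream = [0, 1, 1, 0, 1] + [b ^ 1 for b in to_bits(ASM, 32)] + [0] * 8
print(find_asm(stream))  # -> (5, True): ASM found inverted at offset 5
```

Once the ASM is located and the polarity known, the bits that follow can be de-inverted if necessary and sliced into frames, which is essentially what the Sync and create PDU block does.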
This is how the GUI of the flowgraph looks when running on the recording. The SNR is excellent, and the constellation has no bit errors.
The following shows the flowgraph for decoding the suppressed-carrier BPSK telemetry. Since there is Doppler drift on the signal, the FLL Band-Edge block is used. There is something interesting about the demodulator: the rectangular pulse filter uses only a 1/2-symbol window, instead of the 1-symbol window which is more common. The reason is that using a 1-symbol window gives a lot of inter-symbol interference. It seems that the transmitter is using some form of triangular pulse shape, instead of the rectangular pulse shape which is normally used in deep space missions.
Here is the GUI of this flowgraph running on the recording from February 22. The SNR is again quite good. I have selected the part of the recording in which there is better SNR, as there are large changes in signal strength in this recording. The bottom panel shows the time-domain waveform. It is clear from this that the transmitter pulse shape is not rectangular.
The frames are 153-byte AOS frames. There is a Frame Error Control Field (CRC-16) which is checked by the GNU Radio flowgraph. Only virtual channel 1 is in use. The spacecraft ID is 0xCE. This does not appear in the SANA registry.
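The FECF check can be sketched as follows (a hedged illustration based on the standard CCSDS parameters, not the flowgraph's actual code): the last two bytes of each frame hold a CRC-16/CCITT-FALSE (polynomial 0x1021, initial value 0xFFFF, no reflection, no final XOR) computed over the rest of the frame.

```python
# Bitwise CRC-16/CCITT-FALSE, as used for the CCSDS Frame Error Control Field.
def crc16_ccitt(data: bytes, crc: int = 0xFFFF) -> int:
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ 0x1021) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc

def fecf_ok(frame: bytes) -> bool:
    """True if the last two bytes match the CRC of the rest of the frame."""
    expected = int.from_bytes(frame[-2:], "big")
    return crc16_ccitt(frame[:-2]) == expected

# Toy 153-byte frame: 151 bytes of data plus a correctly computed FECF
data = bytes(range(151))
frame = data + crc16_ccitt(data).to_bytes(2, "big")
print(fecf_ok(frame))  # -> True
```

A frame whose computed CRC disagrees with its FECF is dropped, which is how the flowgraph filters out Viterbi decoder output that synchronized on noise.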
The following figure shows a raster map of all the AOS frames decoded from the first recording. The FECF has been removed from these frames already. There are two zones which are always zero. These show as dark purple bands in the plot. The first zone is the space where there would usually be an M_PDU header. All zeros is a valid value for an M_PDU header, which indicates that the first packet starts at the beginning of the packet zone (first header pointer equal to zero). The second zone is the space where there can be an Operational Control Field carrying a CLCW. In this case the CLCW is probably missing, since CLCWs typically have some non-zero bits.
The payload in the frames looks quite random. In fact, the FCC documentation says that the data is encrypted. However, there are also some patterns, which probably correspond to padding used to fill up the rest of an AOS frame when there isn’t more data to send immediately.
The raster map of the frames decoded in the second recording looks very similar.
The GNU Radio flowgraphs and the Jupyter notebook used in this post, as well as binary files containing the decoded frames, can be found in this repository.
Like Frodo, you probably don’t want anything to do with a magic ring infused with a profound evil that slowly works its dark will into your body and soul, driving you to ever-worsening acts of selfishness and cruelty. Unlike Frodo, you may be able to have an open source version of a magic ring that you can make as good or evil as you’d like! Many companies offer smart rings; here’s an open source version that may not need to be carried all the way to Mount Doom. More from Hackster:
A new option has recently been developed by a team of engineers at The Pennsylvania State University that is both open source and compact. Called OmniRing, this smart ring platform has a miniature form factor, long battery life, sensing and processing units, and is water resistant, making it practical for daily use. It was designed with finger motion analytics and healthcare applications in mind, but the open design allows each user to customize it as much as they wish. And with the total cost coming in at less than $25 wholesale, or about $62 for a single unit, OmniRing is accessible to just about everyone.
The ring is composed of a flexible printed circuit board with an inertial measurement unit (IMU) and photoplethysmography (PPG) sensors. The Nordic Semiconductor nRF52832 system on a chip, with an Arm Cortex-M4 processor running at 64 MHz was selected to handle onboard processing. Since this chip also supports wireless protocols like Bluetooth Low Energy, it can transmit sensor readings to external devices for more complex analyses. The components are housed in a waterproof 3D printed case that was designed using a combination of resin and thermoplastic polyurethane. A 3.7 volt ring-shaped rechargeable LiPo battery, which can operate for up to a week between charges, completes the design of the 2.5 gram OmniRing.
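As a taste of what a PPG-equipped ring enables, here is a minimal sketch (not OmniRing code; the 50 Hz sample rate and synthetic 75 bpm waveform are assumptions for the demo) of estimating heart rate from a PPG trace by finding the dominant frequency in its spectrum:

```python
import numpy as np

# Estimate pulse rate from a PPG waveform via its frequency spectrum.
fs = 50.0                      # assumed sample rate, Hz
t = np.arange(0, 20, 1 / fs)   # 20 seconds of signal
bpm_true = 75.0
ppg = (np.sin(2 * np.pi * (bpm_true / 60.0) * t)
       + 0.3 * np.random.default_rng(1).normal(size=t.size))  # noisy pulse

spectrum = np.abs(np.fft.rfft(ppg - ppg.mean()))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
band = (freqs > 0.7) & (freqs < 3.5)   # plausible heart rates: 42-210 bpm
bpm_est = freqs[band][np.argmax(spectrum[band])] * 60.0
print(round(bpm_est))  # 75
```

A real device would add motion-artifact rejection using the IMU, but the core frequency-domain idea is the same.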
NASA and Intuitive Machines co-hosted a news conference on Feb. 28 to provide a status update on the six NASA instruments that collected data on the IM-1 mission.
Mission challenges and successes were discussed during the briefing, including more than 350 megabits of science data downloaded and ready for analysis. During transit, all powered NASA payloads operated and received data. During descent and landing, guidance and navigation data was collected that will help improve landing precision in the future, and all three payloads designed to operate on the surface have received data.
The first images from the lunar surface are now available and showcase the orientation of the lander along with a view of the Moon’s south pole region. Intuitive Machines believes the two actions captured in one of its images enabled Odysseus to gently lean into the lunar surface, preserving the ability to return scientific data. Now that the image has been transmitted to Earth, there is additional insight into Odysseus’ position on the lunar surface.
On Feb. 22, NASA science instruments and technology on board Intuitive Machines’ Nova-C lander, called Odysseus, landed in the Moon’s south pole region, marking the United States’ first lunar landing since Apollo 17. This was also the first landing under the agency’s CLPS (Commercial Lunar Payload Services) initiative, with each NASA payload transmitting valuable science data from the lunar surface.
Additional updates can be found by watching the news conference here.
NASA’s SPHEREx (Spectro-Photometer for the History of the Universe, Epoch of Reionization, and Ices Explorer) mission will launch no later than April 2025. The orbiting telescope will conduct a two-year all-sky survey in optical and infrared light. The main focus of the mission is to gather data on more than 300 million galaxies and 100 million stars in the Milky Way. But SPHEREx will also add to our knowledge of Potentially Hazardous Objects (PHOs).
A new paper examines SPHEREx’s capabilities and how the mission can contribute to Planetary Defense (PD). Its title is “Planetary Defense Use of the SPHEREx Solar System Object Catalog.” It’s currently in pre-print, and the lead author is Carey Lisse from the Space Exploration Sector at the Johns Hopkins University Applied Physics Laboratory.
SPHEREx “provides a unique space-based opportunity to detect, spectrally categorize, and catalogue hundreds of thousands of solar system objects at NEOWISE sensitivities,” the authors write. NEOWISE is NASA’s successful asteroid-finding mission that just reached ten years of operation and has found over 3,000 NEOs (Near-Earth Objects). “By leveraging SPHEREx data, scientists and decision-makers can enhance our ability to track and characterize PHOs, ultimately contributing to the protection of our planet,” the authors of the new paper explain.
Among the many calamities that have struck life on Earth, asteroid impacts are the most dramatic. About 66 million years ago, an asteroid struck Earth and wiped out the dinosaurs. That asteroid was about 10 km in diameter and wreaked havoc on Earth’s biosphere at the time. The odds of another asteroid strike are never zero, and even a less massive impactor could alter civilization forever, causing unimaginable suffering and strife.
While some researchers are working on ways to destroy or deflect PHOs, others are working on cataloguing as many of them as they can. This is where SPHEREx comes in.
SPHEREx will follow the same type of orbit that NEOWISE does. It’s called a sun-synchronous polar orbit, and it means that the observatory will collect data from both the leading and trailing directions. That will allow SPHEREx to cover the range of latitudes in the entire sky every six months and to cover the ecliptic poles in each orbit.
SPHEREx was built to address three main science goals: Measuring the Anisotropy of Cosmic Inflation, Determining the History of Galaxy Formation, and Surveying Ices in Molecular Clouds. Detecting PHOs is its side hustle. But its powerful infrared capabilities mean it’ll do more than just detect them.
When it comes to asteroid detection, we’re in a race against time. The pace may be slow, but it’s still a race and one we can win. Time may be on our side.
PHOs are defined as objects that come within 0.05 AU of Earth and have an absolute magnitude of 22 or brighter. These objects are close enough to pose an impact risk and large enough to be catastrophic if they do strike Earth. Magnitude 22 corresponds to an object with an albedo of 0.14 and a size of about 140 meters. Though much, much less massive than the dinosaur-killing Chicxulub impactor, these objects can still cause widespread damage.
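Those figures follow from the standard relation between an asteroid's absolute magnitude H, its geometric albedo, and its diameter, D(km) = (1329 / √albedo) × 10^(−H/5). A quick check of the 140-meter claim:

```python
import math

# Standard asteroid size-from-brightness relation:
#   D(km) = (1329 / sqrt(albedo)) * 10^(-H/5)
def diameter_km(H: float, albedo: float) -> float:
    return 1329.0 / math.sqrt(albedo) * 10 ** (-H / 5.0)

# H = 22 with the assumed geometric albedo of 0.14:
print(round(diameter_km(22.0, 0.14) * 1000))  # 141 (meters)
```

The same formula also shows why albedo matters so much: a dark object (albedo 0.05) of the same brightness would be over 200 meters across.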
Scientists predict that an impactor of this size should strike Earth every few tens of thousands of years. As a result, Congress instructed NASA to detect 90% of these NEOs. NASA has made lots of progress, and with the commissioning of the Legacy Survey of Space and Time, it will likely reach the 90% goal in less than a decade.
But SPHEREx will do more than detect PHOs, NEOs and NEAs. It will reveal crucial information that will allow us to prepare for their approach. “Accurate spectral categorization of NEOs is a key factor in assessing the threat from a potential impactor as well as developing effective mitigation strategies,” the researchers explain. “Succinctly, whether the impactor is made of rock, metal, or an icy organic mix is critical to know before one attempts to terminate the hazard (“know thy impactor”), and this determination is typically made using near-infrared spectrophotometry.”
The observatory will acquire millions of exposures of the sky in 102 visual and infrared wavelengths. Some wavelengths span the same range as NEOWISE but in 40 discrete channels rather than NEOWISE’s two, and SPHEREx’s observations will feature an additional 62 spectral channels beyond NEOWISE’s coverage. What does that add up to?
“SPHEREx measurements will be uniquely useful for spectral typing, quick object compositional characterization, population context, size/albedo determination, and temporal trending of objects in the current epoch,” the authors explain. Spectral type, rotation state, albedo, and size are key factors in building up our planetary defense against asteroids and comets.
SPHEREx is an important step in safeguarding our home in the Solar System as best we can. Nature can throw a lot of powerful, vexing curveballs, and NASA’s efforts to detect them are foundational to developing ways to protect Earth.
SPHEREx will do more than find PHOs, and by characterizing them, its data could be the key to effective mitigation.
The post A New Space Telescope will Map the Universe and Help Protect the Earth from Asteroids appeared first on Universe Today.
Statistics: Posted by Mike Olason — 25 February 2024, 00:42:07 AM
Originally published 01/29/2024. IntraVision Group is using NASA plant-growth research to take vertical farming to new heights. Twenty- and 30-foot towers of hydroponic trays include customized LED lighting, nutrients, water filtration, air flow, and CO2 levels to meet crop needs. It’ll be years before astronauts living beyond the reach of
The post NASA data and expertise helps controlled environment agriculture reach new heights appeared first on Association for Vertical Farming.
Sleeping bookshelf hideaway on a NYC rooftop.
The post Scientists Spot the Brightest Object Ever – 500 Trillion Times More Luminous Than the Sun appeared first on Good News Network.
For decades, astronomers have been puzzled by the detection of random, mysterious radio bursts that seemingly originate from deep space. Recently, a team announced the detection of another of these puzzling radio bursts, observed by two NASA X-ray telescopes operating in low-Earth orbit: the Neutron Star Interior Composition Explorer (NICER) and the Nuclear Spectroscopic Telescope Array (NuSTAR).
The radio burst observed was a “fast radio burst,” and both telescopes were watching its source just minutes before and minutes after the event occurred. Having both telescopes on target at the same time has provided scientists with unprecedented amounts of data, allowing them to zoom in on the radio burst and gain a better understanding of the extreme nature of the phenomenon.
Though brief, these events release extremely high amounts of energy, with fast radio bursts specifically releasing as much energy as the Sun does in an entire year in the span of just a fraction of a second.
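To put a number on that claim, a quick back-of-the-envelope calculation (the one-millisecond burst duration is an assumed typical value):

```python
# The Sun's yearly energy output, released over roughly a millisecond,
# implies an enormous instantaneous power.
L_SUN = 3.8e26          # solar luminosity, W
YEAR = 3.156e7          # seconds per year
E = L_SUN * YEAR        # Sun's yearly energy output, J
power = E / 1e-3        # released over an assumed ~1 ms burst
print(f"{E:.1e} J, {power:.1e} W")
```

That works out to roughly 10^34 joules delivered at about 10^37 watts, some ten billion times the Sun's steady output.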
However, the quick nature of the bursts makes their origin very difficult to pinpoint. Throughout the last few decades, all detected bursts were traced to locations outside of the Milky Way, meaning that whatever created them was too far away for astronomers to decipher. However, a fast radio burst that occurred on April 28, 2020, was found to have originated within the Milky Way.
Astronomers eventually traced the burst’s origin to an extraordinarily dense object called a magnetar, which is a type of neutron star: the collapsed core left over after a star explodes. Following the 2020 burst, scientists were able to learn more about the nature of radio bursts and magnetars, which allowed them to further characterize radio bursts that originated from outside the Milky Way.
Interestingly, in October 2022, the same magnetar — named SGR 1935+2154 — emitted another fast radio burst. At the moment of the burst, NASA’s NICER and NuSTAR happened to be observing SGR 1935+2154 at the same time, with both telescopes having already observed the magnetar for several hours. Following the detection of the burst, the telescopes continued to collect data on SGR 1935+2154 in order to catch a glimpse of what happens within and around a magnetar when a radio burst is emitted, as well as what happens just before and just after a burst has been emitted.
The 2022 SGR 1935+2154 burst occurred between two periods in which the magnetar’s rotational rate rapidly increased. Magnetars are small cosmic objects that rotate extremely fast: SGR 1935+2154 measures just 20 kilometers in diameter and completes a rotation roughly every 3.2 seconds, meaning that its surface moves at approximately 70,000 kilometers per hour. Slowing or speeding up such a rapidly rotating object requires an immense amount of energy, yet that is exactly what SGR 1935+2154 did.
The team of researchers analyzing the data from the event noted that the magnetar slowed down between the two periods of increased rotation (called “glitches”), shedding the extra rotational speed in just nine hours. This observation surprised the scientists, as the sudden decrease in rotational speed occurred around 100 times faster than had ever been recorded.
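As a sanity check on those figures (a sketch; the 20-kilometer diameter and roughly 3.2-second spin period are SGR 1935+2154's published values), the surface speed is just circumference divided by period:

```python
import math

# Surface speed of a rotating sphere: circumference / spin period.
diameter_km = 20.0   # published size of SGR 1935+2154
period_s = 3.2       # published spin period
surface_speed_kms = math.pi * diameter_km / period_s
print(f"{surface_speed_kms:.1f} km/s ~ {surface_speed_kms * 3600:.0f} km/h")
```

That gives roughly 20 km/s, or about 70,000 km/h, at the equator.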
“Typically, when glitches happen, it takes the magnetar weeks or months to get back to its normal speed. So clearly, things are happening with these objects on much shorter time scales than we previously thought, and that might be related to how fast radio bursts are generated,” said lead author and astrophysicist Chin-Ping Hu of the National Changhua University of Education in Taiwan.
Artist’s concept of a radio burst erupting from a magnetar. (Credit: NASA’s Goddard Space Flight Center/Chris Smith (USRA))
While the glitches may explain the production of radio bursts from magnetars, understanding exactly how magnetars produce radio bursts is difficult, and scientists have to consider tens or hundreds of different variables to confirm a hypothesis.
One such variable is gravity. Neutron stars and magnetars are among the densest cosmic objects known to exist, with a teaspoon of their surface material weighing about a billion tons on Earth. Because the surface gravity of a compact object grows with its density, the gravitational field of a neutron star or magnetar is extraordinarily strong. For example, a marshmallow falling onto the surface of a neutron star would release as much energy as an atomic bomb upon impact.
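The marshmallow comparison checks out with a rough calculation (a sketch assuming a 1.4-solar-mass, 10-kilometer-radius neutron star and a 5-gram marshmallow falling from far away):

```python
# Gravitational energy released by a small mass falling onto a neutron star:
# E = G * M * m / R, converted to kilotons of TNT.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M = 1.4 * 1.989e30   # assumed neutron star mass, kg
R = 10e3             # assumed neutron star radius, m
m = 0.005            # assumed marshmallow mass, kg

energy_j = G * M * m / R
kilotons = energy_j / 4.184e12   # 1 kiloton of TNT = 4.184e12 J
print(f"{kilotons:.0f} kilotons of TNT")  # 22 kilotons of TNT
```

For comparison, the Hiroshima bomb released about 15 kilotons, so "atomic bomb" is no exaggeration.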
This extreme gravitational force means that the surface of a magnetar is very volatile, as it regularly releases large bursts of X-rays and other forms of high-energy light. The NICER and NuSTAR data from the 2022 SGR 1935+2154 burst showed that there was an increase in the amount of X-ray and high-energy light eruptions from the magnetar in the time leading up to the radio burst. This increase in high-energy light around SGR 1935+2154 is actually what led NICER and NuSTAR teams to orient their spacecraft in the direction of the magnetar.
Rendering of NuSTAR. (Credit: NASA)
“All those X-ray bursts that happened before this glitch would have had, in principle, enough energy to create a fast radio burst, but they didn’t. So it seems like something changed during the slowdown period, creating the right set of conditions,” said a co-author of the study, a research scientist at the University of Maryland and NASA’s Goddard Space Flight Center in Maryland.
However, what else could explain the 2022 burst from SGR 1935+2154 and the emissions of high-energy light before the burst?
One potential explanation is that since the exterior of a magnetar is solid, the extreme density would force the interior of the star to become a superfluid. If this is the case, then the exterior and interior of SGR 1935+2154 exist in a delicate balance, wherein the internal superfluid can deliver immense eruptions of energy to the surface of the magnetar, eventually creating fast radio bursts. Hu et al. believe that this is likely what created both of the glitches that occurred before and after the radio burst.
If the first of the two glitches, occurring before the burst, created a crack in the magnetar’s surface due to the immense stresses of its fluctuating rotational rate, then SGR 1935+2154 likely would have released material from its interior into space, creating a radio burst. In physics, when a spinning object loses mass, its rotational rate slows down. Thus, scientists believe that the eruption of internal material that accompanied the fast radio burst may be what caused the rapid decrease in rotational rate between the two glitches.
What’s causing mysterious bursts of radio waves from deep space?
Astronomers may be a step closer to an answer after using two X-ray telescopes to zoom in on a dead star’s erratic behavior. Learn how we're breaking down the data. https://t.co/b2E5EWebie
— NASA JPL (@NASAJPL) February 14, 2024
Nonetheless, the teams have only been able to observe one of these events in real time. As such, Hu et al. still cannot determine with confidence which of the aforementioned factors (as well as others, like a magnetar’s complex magnetic field) specifically leads to the creation of fast radio bursts. Some of these factors may have absolutely nothing to do with radio bursts and their creation.
“We’ve unquestionably observed something important for our understanding of fast radio bursts. But I think we still need a lot more data to complete the mystery,” said George Younes, a researcher at Goddard and a member of the NICER science team specializing in magnetars.
Hu et al.’s results were published on Feb. 14 in the journal Nature.
(Lead image: Artist’s depiction of a radio burst erupting from a magnetar. Credit: NASA/JPL-Caltech)
The post NuSTAR and NICER observe same radio burst, provide hints into nature of phenomenon appeared first on NASASpaceFlight.com.
The recent increase in launches to meet growing demands for communications and observation satellites is adding more and more spacecraft to orbit every year. This is especially true in the more congested low-Earth orbit (LEO).
With a sharp focus on making space safe and sustainable for the current as well as future generations, organizations such as Astroscale are responding to an emerging market of commercial on-orbit services for active and defunct satellites in orbit. These plans go beyond just maintenance and lifespan extensions, including the refueling of spacecraft as well as the active removal of space debris.
Headquartered in Japan, Astroscale is one of several private companies innovating in this space. Its Active Debris Removal by Astroscale-Japan (ADRAS-J) surveying mission, which launched on a Rocket Lab Electron rocket last week, aims to demonstrate the necessary operations and technology that will underpin the delivery of new on-orbit services.
Space debris, more colloquially referred to as “space junk,” can take several forms, ranging from satellites that have reached the end of their operational life to discarded rocket stages that have performed their duties. Some rocket stages, including those from twentieth-century missions, have been discarded into a “graveyard orbit” that sits at least 300 kilometers above geostationary orbit (GEO) at 35,800 kilometers. The industry’s attention is, however, on the busier LEO.
The ADRAS-J spacecraft approaches the unresponsive upper stage of an H-IIA rocket. (Credit: Astroscale)
Different factors contribute to orbital decay, including orbital altitude, atmospheric drag, the ballistic coefficient of the object, and even the “weather” in space. For this reason, spacecraft maintain a supply of propellant for periodic station-keeping. Decay times increase exponentially with altitude, surpassing 10 years beyond about 500 kilometers.
Satellites in LEO typically have a five-year lifespan, while those in GEO often operate for up to 15 years. In late 2022, the Federal Communications Commission (FCC) took steps to mitigate the growing debris within 2,000 kilometers of Earth by lowering the timeframe in which operators of FCC-licensed satellites must deorbit their satellites after mission completion from 25 years to five.
The FCC issued a landmark $150,000 fine to the satellite company Dish Network in October 2023 for failing to dispose of its EchoStar-7 satellite into a higher “graveyard orbit” at the end of its lifespan in GEO. The satellite achieved less than half of the required 300-kilometer raise due to insufficient propellant.
Modern launch vehicle operators are acutely aware of the importance of debris mitigation and will precisely control the deorbit of their spent stages. For example, Falcon 9 second stages perform retrograde burns on many missions to safely deorbit after payload deployment. Similarly, Rocket Lab’s Electron kick stage sometimes fires its Curie engine one final time to deorbit itself, while the second stage reduces its own altitude so that nothing is left in orbit except the deployed satellites.
Starlink satellites use their ion thrusters to lower their orbit at the end of their useful life, and at altitudes below 600 kilometers the satellites will leave no persistent debris, even if a satellite fails while in orbit. When the Peregrine Mission One lander suffered an anomaly earlier this year, Astrobotic made the decision to preserve cislunar space from potential debris and used the lander’s remaining propellant to target a path in which the craft would re-enter Earth’s atmosphere and burn up.
Infographic showing orbital debris decay. (Credit: ULA)
Despite these responsible mitigation steps, LEO remains populated with hundreds of thousands of pieces of debris, including end-of-life spacecraft which have run out of power and can neither communicate nor reorient themselves. These objects are the focus of the ADRAS-J mission.
ADRAS-J Mission
Rocket Lab’s ‘On Closer Inspection’ mission launched atop an Electron rocket from Launch Complex 1 in Mahia, New Zealand, on Feb. 19 at 3:52 AM NZDT (14:52 UTC on Feb. 18), carrying the 180-kilogram ADRAS-J spacecraft.
The goal of the mission is to safely approach, characterize, and fly an observational orbit path around a large uncommunicative piece of space debris in LEO — the upper stage of a discarded Japanese H-IIA rocket.
Previous demonstration missions similar to ADRAS-J have either deployed a target object as part of the demonstration or were able to communicate with their target. What sets ADRAS-J apart is the non-communicative, non-powered, and uncontrollable nature of the rocket stage.
With no GPS or other data being transmitted from the object, the initial identification and approach will be based on limited ground-based data. After this, onboard visual and other sensors will locate the stage and determine its relative distance and attitude before performing a safe approach.
ADRAS-J craft approaches the discarded upper stage of an H-IIA rocket. (Credit: Astroscale)
ADRAS-J will approach at around 600 kilometers altitude using a series of corkscrew-style “safety ellipse” maneuvers. Once near the stage, it will execute a series of “Rendezvous and Proximity Operations,” which demonstrate the technology needed for a precise rendezvous with a large target object.
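The bounded relative loops underlying such approaches can be illustrated with the Clohessy-Wiltshire (Hill) equations of relative orbital motion (a sketch, not Astroscale's actual guidance): choosing the along-track rate ydot0 = -2·n·x0 cancels the secular drift and produces a closed 2:1 relative ellipse around the target. The 600 km circular orbit and 100 m radial offset are assumed values.

```python
import numpy as np

# Drift-free relative motion in the Clohessy-Wiltshire (Hill) frame:
# x is radial, y is along-track, n is the target's mean motion.
mu = 3.986004418e14              # Earth's GM, m^3/s^2
a = 6378e3 + 600e3               # orbit radius for ~600 km altitude, m
n = np.sqrt(mu / a**3)           # mean motion, rad/s

x0 = 100.0                       # initial radial offset, m
ydot0 = -2.0 * n * x0            # along-track rate that cancels drift

t = np.linspace(0, 3 * 2 * np.pi / n, 2000)   # three orbits
x = (4 - 3 * np.cos(n * t)) * x0 + (2 / n) * (1 - np.cos(n * t)) * ydot0
y = (6 * (np.sin(n * t) - n * t) * x0
     + (1 / n) * (4 * np.sin(n * t) - 3 * n * t) * ydot0)

print(f"max |x| = {np.abs(x).max():.0f} m, max |y| = {np.abs(y).max():.0f} m")
# ~100 m radial by ~200 m along-track: a closed 2:1 ellipse, no drift
```

A true "safety ellipse" adds a cross-track component so the chaser never crosses the target's velocity vector, but the bounded in-plane motion above is the foundation.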
A further fly-around phase will then determine the target’s spin rate and axis before ADRAS-J completes its objectives by settling into a stable position a short distance away from the stage, aligned with the stage’s orientation.
Different navigational techniques and sensors are used to progressively approach the target object: VISCam (smart image processing, enabling object location detection), IRCam (infrared imaging), and LIDAR (pulsed laser light used to measure distances). The latter is used for the final approach, to align with the object and demonstrate that the craft can hold a maintained close distance.
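The LIDAR ranging principle is simple: a light pulse is timed out and back, and the distance is half the round-trip path. A one-line illustration (the 2-microsecond echo time is an invented example):

```python
# LIDAR ranging: distance = (speed of light x round-trip time) / 2.
C = 299_792_458.0          # speed of light, m/s
round_trip_s = 2.0e-6      # example: a 2 microsecond echo
distance_m = C * round_trip_s / 2.0
print(f"{distance_m:.0f} m")  # 300 m
```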
The inspection stage enables images to be captured every 30 seconds, and the data collected by ADRAS-J will directly inform Astroscale’s other ongoing programs that focus on orbital debris clearance and maintenance.
The Japan Aerospace Exploration Agency (JAXA) selected this mission as the initial phase of their Commercial Removal of Debris Demonstration Project. The second phase will progress onto the actual capture and removal of a debris object, but has yet to be contracted.
The mission is intended to prompt global discussions with governments and companies in the space industry on the implementation of Active Debris Removal (ADR). It will follow the measures and processes outlined in the “Guidelines on a License to Operate a Spacecraft Performing On-Orbit Servicing” issued by the Japanese government in November 2021. These were the first guidelines of their kind and the result of consultation with various space agencies, ministries, and industry experts, including leading private space companies.
Astroscale has been uniquely focused on the safe removal of space debris since its creation in 2013 and is responsible for the entire lifecycle of this project — from design and construction to testing, launch, and subsequent operations.
ADRAS-J will be the first debris management mission to target an object of this size, with the H-IIA stage measuring 11 by four meters. This stage launched the Greenhouse Gases Observing Satellite in 2009 and is part of an expendable launch system that has launched payloads since 2001, including the XRISM space telescope and the SLIM lunar lander.
Other orbital debris management missions
The technology on ADRAS-J is leveraged from Astroscale’s End-of-Life Services by Astroscale Demonstration (ELSA-D) mission, which completed close rendezvous operations between two semi-cooperative spacecraft in May 2022, with one spacecraft equipped with a magnetic docking plate.
ELSA-D was developed via a partnership between the European and UK Space Agencies and satellite operator OneWeb and is set to be succeeded by the multi-spacecraft (ELSA-M) variant. ELSA-M intends to demonstrate a second capture and removal in the same mission as well as the advanced capture of objects that are in an uncontrolled spin or tumble.
Artist’s impression of COSMIC capturing its client. (Credit: Astroscale)
Astroscale’s Clearing Outer Space Mission through Innovative Capture (COSMIC) is planned to be an evolution from the ELSA-M variant and will provide deorbiting functionality for defunct satellites as a commercial service to satellite operators. COSMIC is expected to launch in 2025 as part of the UK’s ADR initiative. It passed a system requirements review in late 2023 and is currently undergoing a preliminary design review of functionality, such as the proposed robotic arm and de-tumbling methods.
ClearSpace is another innovative company leading a consortium, with the backing of the UK Space Agency, to design and execute a mission to clear up orbital debris.
Their mission, known as Clearing of the LEO Environment with Active Removal (CLEAR), entered the design review stage last October and intends to remove two UK-registered defunct objects from LEO that have been inactive for more than ten years. The CLEAR spacecraft will feature unfolding robotic arms that will be used to capture and release target objects. CLEAR is currently projected to launch in the second half of 2026 on a Vega C rocket from French Guiana.
Other low-cost methods explored include Millennium Space’s wide drag tape, which massed less than one kilogram and could be released by a craft to self-deorbit cheaply and quickly. Of the two satellites deployed during the DRAGRACER mission in 2020, the satellite with the 70-meter drag tape deorbited within eight months, while the other is estimated to take at least seven years to deorbit.
Artist’s impression of one of the DRAGRACER satellites deploying the 70-meter Terminator Tape developed by Tethers Unlimited. (Credit: Millennium Space)
Are Earth’s orbits becoming too crowded?
Between 1957 and 2012, the number of satellites launched per year stayed reasonably consistent at approximately 150, as the United Nations noted in its “For All Humanity” review. This period includes the pioneering years of human spaceflight and the development of the International Space Station (ISS) and global communications satellites. The number of satellite launches per year has increased exponentially in the last decade, however, exceeding 2,000 by 2022.
Space is vast, but some are concerned that the growing volume of expired objects in orbit increases the potential for them to become a hazard to other spacecraft, along with the risk of their orbits decaying in less predictable ways. Communications constellations such as Starlink and Kuiper will be major contributors, with each eventually hoping to field thousands of satellites.
Debris is not, of course, limited to retired craft and stages. NASA estimates that there are about 100 million particles of debris that are larger than one millimeter in size, of which around 25,000 are larger than 10 centimeters — the size of a softball.
The ISS adjusts its course if the possibility of a collision with debris exceeds one in 10,000 and has course-corrected over thirty times since 1999, according to NASA.
Illustration of the concentration of orbital debris from LEO to GEO. (Credit: NASA ODPO)
The greatest concentration of debris lies between 750 and 1,000 kilometers in altitude, with most debris orbiting within 2,000 kilometers of the surface. The time this debris could stay in space ranges from a few years to over a century, depending on altitude. Each year, somewhere between 200 and 400 of the larger tracked objects re-enter Earth’s atmosphere, though fewer than 100 of these are large enough to survive re-entry and reach the surface in some form.
The primary concern among scientists is “Kessler Syndrome,” named after NASA scientist Donald Kessler, who wrote an influential paper in 1978 warning that collisions in congested space could cause a domino effect of additional impacts, cascading to the point where LEO becomes an untraversable debris field for many years.
It is estimated that a third of all cataloged debris can be traced to just two such “fragmentation events” in space. The most severe occurred in 2009, when the Iridium 33 satellite collided with the Russian Cosmos 2251 military satellite, which had exceeded its five-year lifespan and was no longer active. The United States military subsequently developed a process that includes daily screenings of all active satellites in orbit to anticipate and react to the risk of colliding objects.
The collision created almost 2000 pieces of debris over 10 centimeters in size, some of which have since decayed from orbit, while half of the Iridium debris and much of the Cosmos debris will remain in orbit for another 10 to 20 years.
The second incident was the intentional destruction of the Fengyun-1C meteorological satellite in 2007 during a Chinese anti-satellite missile test. At a closing speed of around 32,400 kilometers per hour, the force of the impact was enough to destroy the satellite without explosives.
The impact created some 3,000 pieces of sizeable debris (which are still regularly tracked as a potential risk to the ISS) and around 150,000 smaller particles, of which more than half could remain in orbit at around 850 kilometers for decades, with the majority likely to remain into the next century.
Other types of space debris and asteroids
One final type of space debris is unexpected and lost items, such as hammers and other tools dropped during extravehicular activities (EVAs).
Astronauts have been accidentally losing items, including tool bags, bolts, and spatulas, ever since Ed White lost his spare thermal glove in 1965 during the first-ever American spacewalk outside the Gemini 4 capsule. Another tool bag was lost during an EVA to replace ISS solar array parts in November 2023 and is expected to burn up in the atmosphere this March.
Asteroid 99942 Apophis will make a sweeping pass closer to Earth than satellites in geostationary orbit in April 2029, with another close approach in 2036. At 340 meters wide, Apophis will make the closest approach by an asteroid of its size for which we’ve had advance notice. Because its path will be inclined away from the equator, experts have ruled out any impact with objects in GEO, and it will not impact Earth for at least the next 100 years. The repurposed OSIRIS-REx spacecraft, now dubbed OSIRIS-APEX, which previously returned a sample of asteroid Bennu, will study Apophis for 18 months as it passes by.
Space tugs and trucks
Space tugs often deliver satellites to their final destination, such as farther out in GEO, enabling the satellites to preserve their onboard propellant.
This will be an advancement over Northrop Grumman’s Mission Extension Vehicles (MEVs), conceived almost a decade earlier by ViviSat. With a 15-year design life, each vehicle can dock with an object, service or deorbit it, undock, and move on to a new target.
MEV-1 approaching the Intelsat 901 satellite before docking. (Credit: Northrop Grumman)
MEV-1 docked with Intelsat 901 in 2020, taking over attitude and propulsion control to reposition the satellite back into GEO and returning it to service for five years. MEV-2 effectively acted as an additional fuel tank and engine for the Intelsat 10-02 satellite. The company has since shifted focus to a Mission Robotic Vehicle, which will deploy a series of Mission Extension Pods and could launch as early as this year.
With the biggest threat being the larger objects that still linger in LEO, the next generation of space tugs, such as Impulse Space’s forthcoming Helios tug, could potentially have a role to play in this new market at lower altitudes too.
Tom Mueller, CEO of Impulse Space, recently told NSF, “I would love to have this vehicle refuel in LEO, remove an object, and then stay up there, refuel again, and keep removing. One Helios, along with propellant depots, could really improve the efficiency of removing those big objects.”
(Lead image: Render of the ADRAS-J spacecraft in orbit. Credit: Astroscale)
The post ADRAS-J mission takes methodical first steps toward the commercial removal of space debris appeared first on NASASpaceFlight.com.
After the Super Bowl ended on Sunday evening, crews began warming up the engines of the hundreds of business jets parked in every corner of the three airports around Las Vegas. Where were these jets headed and just how many left Las Vegas after the game?
A total of 525 business jets departed the Las Vegas area after the Super Bowl from Harry Reid International Airport, Henderson Executive Airport, and North Las Vegas Airport. 81 of those (15%) made the one-hour flight to airports in the Los Angeles area. Airports in and near New York City saw 33 flights, while the Miami area saw 30. San Diego and San Jose round out the top five destination cities.
San Francisco and Kansas City make strong showings, obviously. Perhaps most surprising on the list of most popular destinations: Des Moines, Iowa, with seven flights.
Business jets departing Las Vegas after the game flew for an average 2 hours 1 minute, traveling to 173 airports in nine countries.
The aircraft that traveled the furthest after the game was VP-BLU, a Bombardier Global Express 7500, which spent 13 hours 20 minutes in the air flying to Ishigaki in southern Japan.
On the other end of the spectrum, N115TB, a Dassault Falcon 900EX, spent just 29 minutes flying from North Las Vegas Airport to Kingman, shaving 90 minutes off the estimated travel time by car.
Traffic departing Las Vegas following the Super Bowl: 525 business jets departed the area after the game ended. (Credit: Flightradar24, February 13, 2024)
The post Leaving Las Vegas: business jets after the Super Bowl appeared first on Flightradar24 Blog.
ESA’s latest laboratory extension is portable: hosted within a standard shipping container, the ESA Transportable Optical Ground Station (ETOGS) can be transported across Europe as needed to perform laser-based optical communications with satellites – including NASA’s Psyche mission, millions of kilometres away in space.
The name aubrites comes from the village of Aubrés in France, where a similar meteorite fell on September 14th, 1836. The team responsible for recovering samples of this latest meteorite was led by SETI Institute meteor astronomer Dr. Peter Jenniskens and MfN researcher Dr. Lutz Hecht. They were joined by a team of staff and students from the MfN, the Freie Universität Berlin, the DLR, and the Technische Universität Berlin days after the meteor exploded in the sky. Together, they found the meteorite fragments in the fields just south of the village of Ribbeck, about 50 km (31 mi) west of Berlin.
Finding the fragments was a major challenge because of the peculiar appearance of aubrites, which resemble ordinary rocks from a distance but look quite different up close. Whereas most meteorites have a thin crust of black glass caused by the extreme heat of passage through the atmosphere, aubrites have a mostly translucent glass crust. Christopher Hamann, a researcher from the Museum für Naturkunde, was involved in the initial classification and participated in the search. As he related in a SETI Institute press release:
“Aubrites do not look like what people generally imagine meteorites to look like. Aubrites look more like a gray granite and consist mainly of the magnesium silicates enstatite and forsterite. It contains hardly any iron and the glassy crust, which is usually a good way to recognize meteorites, looks completely different than that of most other meteorites. Aubrites are therefore difficult to detect in the field.”
The asteroid (2024 BX1) was first spotted by Hungarian astronomer Dr. Krisztián Sárneczky using one of the telescopes at the Konkoly Observatory in Budapest. The task of tracking it and predicting where it would impact Earth’s atmosphere was performed by NASA’s Scout and ESA’s Meerkat Asteroid Guard impact hazard assessment systems, with Davide Farnocchia of JPL/Caltech providing frequent trajectory updates. Like the Chelyabinsk meteor that exploded over southern Russia in 2013, this explosion was witnessed by many and filmed (though it caused no damage).
This was Jenniskens’ fourth guided recovery of a small asteroid that fell to Earth, the previous events being a 2023 impact in France, a 2018 impact in Botswana, and a 2008 impact in Sudan. As he explained, this latest asteroid was particularly challenging to track down:
“Even with superb directions by meteor astronomers Drs. Pavel Spurný, Jirí Borovicka, and Lukáš Shrbený of the Astronomical Institute of the Czech Academy of Sciences, who calculated how the strong winds blew the meteorites and predicted that these could be rare enstatite-rich meteorites based on the light emitted by the fireball, our search team initially could not easily spot them on the ground. We only spotted the meteorites after a Polish team of meteorite hunters had identified the first find and could show us what to look for. After that, our first finds were made quickly by Freie Universität students Dominik Dieter and Cara Weihe.”
This past week, Jenniskens’ colleagues at the MfN officially announced that they had conducted their first analyses of one of the meteorite fragments. The process, led by Dr. Ansgar Greshake, the scientific head of the MfN’s meteorite collection, used an electron beam microprobe to study the mineralogy and chemical composition of the fragments. The results revealed that the fragments are consistent with an achondrite meteorite of the aubrite type, and they were submitted to the International Nomenclature Commission of the Meteoritical Society on February 2nd, 2024, for verification.
“Based on this evidence, we were able to make a rough classification relatively quickly,” said Greshake. “This underlines the immense importance of collections for research. So far, there is only material from eleven other observed falls of this type in meteorite collections worldwide.”
Further Reading: SETI
The post Fragments From That Asteroid That Exploded Above Berlin Have Been Recovered and They're Really Special appeared first on Universe Today.
The latest observations focus on a black hole region known as 3C 84, or Perseus A. It is a radio-bright source in a galaxy more than 200 million light-years away. Even the latest iteration of the EHT can’t resolve the horizon glow of the black hole as was done with M87* and Sgr A*, but it can see the bright region surrounding the black hole, where magnetic fields are particularly intense.
The 3C 84 black hole is located in the galaxy NGC 1275, which is part of the Perseus cluster. The galaxy is not just distant, it also has a central region rich in dust, which shrouds the black hole. Optical light can’t penetrate the region, but radio light can. The Event Horizon Telescope can also capture the polarization of radio light coming from the area. This is important because charged particles within a strong magnetic field emit polarized light. By mapping this polarization astronomers can study magnetic fields.
One focus of this work is to understand how supermassive black holes can generate powerful jets that stream away from the black hole at nearly the speed of light. Magnetic fields are key. As ionized matter falls into a black hole, it can bring strong magnetic fields with it. These fields can pin to the accretion disk of the black hole, intensifying the fields in the region to the point that it becomes difficult for the black hole to capture more matter. This is known as a magnetically arrested disk.
One idea is that as the magnetically arrested disk rotates around the black hole, magnetic field lines become twisted, winding ever tighter and trapping magnetic energy. The release of this energy through magnetic realignment could power the formation of ionized jets. While such a magnetic realignment hasn’t been observed, this study shows that we might be able to capture such an event.
Reference: Paraschos, G. F., et al. “Ordered magnetic fields around the 3C 84 central black hole.” Astronomy & Astrophysics 682 (2024): L3.
The post The Event Horizon Telescope Zooms in on a Black Hole's Jet appeared first on Universe Today.
An international team of astronomers led by UCLA’s Smadar Naoz is running simulations of early galaxy formation. Their computer programs track the circumstances of galactic births not long after the Big Bang. These “hot off the press” computer models include some new wrinkles: they take into account previously neglected interactions between dark matter and the primordial “stuff” of the Universe, namely hydrogen and helium gas. The result of the simulations: tiny, bright galaxies that formed more quickly than in computer models that didn’t include those motions. Now astronomers just need to find them, using JWST, to see if their theories of dark matter hold up.
How would interactions between baryonic matter and dark matter make a difference? Here’s one likely story: in the early Universe, clouds of gas moved at supersonic speeds past clumps of dark matter. It bounced off the dark matter. Eventually, after millions of years, the gaseous material fell back together to form stars in a blast of star birth. The team’s simulations track the formation of those galaxies right after the Big Bang.
Naoz’s team thinks that the existence of those smaller, brighter, more distant galaxies could confirm the so-called “cold dark matter” model. It suggests that the Universe was in a hot dense state containing only gases after the Big Bang. Over time, it evolved to a lumpy distribution of galaxies (and eventually galaxy clusters). Along the way, stars and galaxies formed, but the earliest steps likely depend on gravitational interaction with dark matter. If the supersonic interactions that Naoz’s team modeled actually happened, then those little galaxies would be the result.
JWST has seen some pretty early galaxies during its time in operation. It hasn’t detected the very earliest ones—yet. However, the images it HAS provided are tantalizing hints at what might exist in earlier epochs and could provide insight into the role of dark matter. So, it makes sense that astronomers want to push its view back in time as far as they can. And, that means looking for bright patches of light that existed a few hundred million years after the Big Bang.
“The discovery of patches of small, bright galaxies in the early universe would confirm that we are on the right track with the cold dark matter model because only the velocity between two kinds of matter can produce the type of galaxy we’re looking for,” said Naoz. “If dark matter does not behave like standard cold dark matter and the streaming effect isn’t present, then these bright dwarf galaxies won’t be found and we need to go back to the drawing board.”
In a paper led by team member and first author Claire Williams, published in the Astrophysical Journal Letters, the team suggests that scientists using JWST begin to look for galaxies that are much brighter than expected. If such galaxies exist, that would be strong evidence the interactions occurred early in cosmic time. If none can be found, then scientists may need to rethink how dark matter interacts. The big question to answer is, if these galaxies exist, how did they form so quickly and why are they so bright?
Let’s examine that by looking at the role of dark matter. The standard cosmological model says that the gravitational pull of clumps of dark matter in the early Universe attracted ordinary matter. Eventually, that caused stars to form, followed by galaxies. Cold dark matter is thought to move slowly compared to the speed of light, so astronomers predicted that the star- and galaxy-formation processes happened very gradually. At least, that’s what earlier simulations suggest.
But what if something else was going on more than 13 billion years ago? How would that change things? It was a time before the first galaxies formed, but a time when ordinary matter, in the form of large overdensities of hydrogen and helium gas, streamed through the expanding Universe. The gas bounced off slower-moving clumps of dark matter and outran their gravitational pull, at least for a time. Then, the baryonic matter massed together again under the influence of dark matter. That’s when the star-birth fireworks began.
“While the streaming suppressed star formation in the smallest galaxies, it also boosted star formation in dwarf galaxies, causing them to outshine the non-streaming patches of the universe,” Williams said. Essentially, the accumulated gas began to fall together after millions of years, leading to a huge burst of star formation. Lots of massive, hot, young stars began to shine, outshining the stars in other small galaxies. Ultimately, this means that since dark matter is impossible to “see,” those brightly shining patches of galaxies could be indirect evidence of its existence, and they would demonstrate the role dark matter played in the creation of galaxies.
Nobody’s seen exactly what Naoz and the team are looking for yet. Once they do, it will go a long way toward providing insight into the role of cold dark matter.
Of course, JWST is a perfect telescope to help see these galaxies. It should be able to peer into regions of the Universe where tiny infant galaxies are brighter than astronomers expect them to be. That extreme luminosity will help JWST spot them, showing them as they looked at a time when the Universe was only a few hundred million years old. Because dark matter is impossible to study directly, searching for those bright patches of baby galaxies in the early Universe could offer an effective test for theories about dark matter and its role in shaping the first galaxies.
Further Reading: Bright Galaxies Put Dark Matter to the Test; The Supersonic Project: Lighting Up the Faint End of the JWST UV Luminosity Function
The post Webb Can Directly Test One Theory for Dark Matter appeared first on Universe Today.