- API data.nasa.gov | Last Updated 2018-07-19T10:29:29.000Z
We propose a rover architecture for Europa and other planetary environments where soft robotics enables scientific investigation or human-precursor missions that cannot be accomplished with solar or nuclear power. This rover resembles a squid, with tentacle-like structures that serve both as electrodynamic tethers to harvest power from locally changing magnetic fields and as a means of bio-inspired propulsion. The electrical energy scavenged from the environment powers all rover subsystems, including one that electrolyzes H2O. Electrolysis produces a mixture of H2 and O2 gas, which is stored internally in the body and limbs of the rover. Igniting this gas expands these internal chambers, causing shape change that propels the rover through fluid or perhaps along the surface of a planetary body. The Phase I effort advances this revolutionary rover concept from TRL 1 to TRL 2. The work will be conducted at Cornell University, led by PI Mason Peck and Co-I Robert Shepherd. If the concept eventually succeeds, it will enable amphibious exploration of gas-giant moons, notably Europa, and it is likely relevant to other moons of Jupiter and Saturn with liquid lakes or oceans. Juno's success notwithstanding, solar power near Jupiter is very limited, and the recent cancellation of SMD's ASRG technology motivates alternatives to nuclear power. The bio-inspired technologies we propose to consider bypass the need to power rovers with limited-lifetime batteries, large solar arrays, or nuclear power. In this one respect, it is a breakthrough concept. Beyond addressing issues of power, this rover concept also bypasses the difficulties that typical mechanisms face in fluid through uniquely suited soft robotics. The expanding-gas locomotion concept is both exotic and eminently realizable, grounded in experimental work by our team.
A millimeter/submillimeter heterodyne sensor for spectroscopy and imaging of cold planetary objects in the outer solar system | data.nasa.gov | Last Updated 2018-09-07T17:42:50.000Z
We propose to develop a tunable millimeter/submillimeter-wave, antenna-coupled, graphene mixer-based heterodyne sensor for future missions to icy moons (Titan, Enceladus, and Europa) of the outer planets of our solar system. The proposed technology is focused on the "Ocean Worlds" mission theme and will make it possible to address a key question: the significance of abiological synthesis of organics, i.e., hydrocarbons, their derivatives, and ions (carbocations and carbanions), in the origin of life on Earth. Investigating this abiotic chemistry is a necessary step in the quest to understand the origin of life and the context of habitability. The goal of planetary exploration research is the identification of molecules of prebiotic importance (e.g., aromatic rings, heterocycles, nucleic acid bases, and amino acids) in space through rotational spectroscopy, and the elucidation of their chemical structures using the proposed heterodyne detection technique. The detector system is based on nonlinear two-wave mixing of the weak THz radiation and a THz local oscillator in a graphene photo-thermo-electric detector (Cai et al., Nature Nanotechnology 9(10) 814 (2014)). This results in a heterodyne response for the weak THz signal at the difference frequency relative to the local oscillator. The noise temperatures of noncryogenic graphene mixers are expected to fall between the respective values of superconducting SIS heterodyne detectors and hot electron bolometer (HEB) square-law mixers, while requiring less local oscillator power than Schottky diode mixers. We expect to bring this technology from TRL 1 to TRL 4. Besides hydrogen and water (as water ice), methane and methanol constitute the major molecular matter in our solar system.
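The mixing principle above can be sketched numerically. In this minimal illustration (with frequencies scaled down from THz to audio range for simplicity; the function name and parameters are our own, not from the proposal), multiplying a weak signal with a local oscillator in a nonlinear element produces a component at the difference frequency, which lies in an easily digitized intermediate-frequency band:

```python
import numpy as np

def difference_frequency(f_signal, f_lo, fs=100_000.0, duration=1.0):
    """Mix a signal with a local oscillator and return the strongest
    spectral line below the LO frequency (the heterodyne IF)."""
    t = np.arange(0.0, duration, 1.0 / fs)
    # nonlinear (multiplicative) mixing:
    # sin(a)sin(b) = 0.5 cos(a-b) - 0.5 cos(a+b)
    mixed = np.sin(2 * np.pi * f_signal * t) * np.sin(2 * np.pi * f_lo * t)
    spectrum = np.abs(np.fft.rfft(mixed))
    freqs = np.fft.rfftfreq(len(t), 1.0 / fs)
    band = freqs < f_lo                      # keep only the low "IF" band
    return freqs[band][np.argmax(spectrum[band])]
```

For example, mixing a 12 kHz signal with a 10 kHz LO yields a 2 kHz difference tone, mirroring how a THz signal is shifted down relative to the THz local oscillator.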
We identified "Olah's nonclassical carbonium ion chemistry" as the preferred chemical pathway for abiotic synthesis of organic compounds on Titan; this pathway begins with the conversion of the most abundant species, methane, to the methonium ion CH5+ (Ali et al., P&SS 109-110 (2015) 46; Puzzarini et al., Astron. J. 154, 82 (2017)). Because of the observed higher reactivity of methanol relative to methane, a feasible new pathway has recently been proposed for the conversion of methanol to various extraterrestrial abiological organics, together with a possible connection to methonium ion-based chemistry (Olah et al., JACS 138 (2016) 1717). If methanol (a derivative of methane) was indeed delivered by meteorites or comets to the icy surface of Europa during the formation of our solar system, the radiation-catalyzed surface chemistry of Europa could mimic the prebiotic chemistry observed on Titan by the Cassini-Huygens mission. Rotational spectroscopy from future orbiting spacecraft (Titan and Europa orbiters) using the proposed heterodyne sensor would provide important information on atmospheric and surface-assisted prebiotic chemistry. For the detection of any biological process or chemical markers of extraterrestrial life, spectroscopy of representative building blocks of life in the Ocean Worlds of the outer solar system (e.g., the composition of the Enceladus plume) is an important step. The proposed graphene mixer-based sensor will also enable measurements of winds with high velocity resolution through Doppler-shift studies of optically thin rotational lines. In addition, this technology will permit a sensitive array-receiver configuration for spectrally resolved thermal imaging of the surfaces of cold planetary bodies in the outer solar system.
A major breakthrough in the technical challenges of designing and developing the proposed sensor will put studies of the spectroscopy, chemistry, and dynamics of the environments of the icy moons of the outer planets within reach, and will provide a lasting impact on the "relevance and significance of extraterrestrial abiological hydrocarbon chemistry" in the origin of life, as stated in the Planetary Science Decadal Survey (2013-2022).
- API data.nasa.gov | Last Updated 2018-07-19T17:56:25.000Z
Regression problems on massive data sets are ubiquitous in many application domains, including the Internet, earth and space sciences, and finance. In many cases, regression algorithms such as linear regression or neural networks attempt to fit the target variable as a function of the input variables without regard to the underlying joint distribution of the variables. As a result, these global models are not sensitive to variations in the local structure of the input space. Several algorithms, including the mixture of experts model, classification and regression trees (CART), and others, have been developed, motivated by the fact that variability in the local distribution of inputs may reflect a significant change in the target variable. While these methods can handle non-stationarity in the relationships to varying degrees, they are often not scalable and, therefore, not used in large-scale data mining applications. In this paper we develop Block-GP, a Gaussian Process regression framework for multimodal data that can be an order of magnitude more scalable than existing state-of-the-art nonlinear regression algorithms. The framework builds local Gaussian Processes on semantically meaningful partitions of the data and provides higher prediction accuracy than a single global model with very high confidence. The method relies on approximating the covariance matrix of the entire input space by smaller covariance matrices that can be modeled independently, and can therefore be parallelized for faster execution. Theoretical analysis and empirical studies on various synthetic and real data sets show high accuracy and scalability of Block-GP compared to existing nonlinear regression techniques.
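The core idea of building independent local Gaussian Processes on partitions of the input space can be sketched as follows. This is a minimal illustration of the block-covariance structure, not the paper's actual implementation; the partition function, kernel hyperparameters, and class names are assumptions for the example:

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0, variance=1.0):
    """Squared-exponential covariance between row-vector sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / length_scale ** 2)

class LocalGP:
    """Exact GP regression on one block of the data."""
    def __init__(self, X, y, noise=1e-2):
        self.X = X
        K = rbf_kernel(X, X) + noise * np.eye(len(X))
        self.alpha = np.linalg.solve(K, y)      # K^{-1} y, factored once
    def predict(self, Xs):
        return rbf_kernel(Xs, self.X) @ self.alpha

class BlockGP:
    """Independent local GPs on partitions of the input space.

    `partition_fn` maps each input row to a block id, standing in for the
    'semantically meaningful' partitions of the paper. Because each block's
    covariance matrix is modeled independently, the fits can run in parallel."""
    def __init__(self, X, y, partition_fn, noise=1e-2):
        self.partition_fn = partition_fn
        ids = np.array([partition_fn(x) for x in X])
        self.models = {b: LocalGP(X[ids == b], y[ids == b], noise)
                       for b in np.unique(ids)}
    def predict(self, Xs):
        out = np.empty(len(Xs))
        for i, x in enumerate(Xs):
            out[i] = self.models[self.partition_fn(x)].predict(x[None, :])[0]
        return out
```

Each local model inverts only its own block's covariance matrix, which is where the order-of-magnitude scaling advantage over a single global GP comes from.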
Global Flood Risk From Advanced Modeling and Remote Sensing in Collaboration With Google Earth Engine | data.nasa.gov | Last Updated 2018-07-19T07:18:46.000Z
As predictive accuracy of the climate response to greenhouse emissions improves, measurements of sea level rise are being coupled with modeling to better understand coastal vulnerability to flooding. Predictions of rising intensity of storm rainfall and larger tropical storms also imply increased inland flooding, and many studies conclude this is already occurring in some regions. Most rivers experience some flooding each year: the seasonal discharge variation from low to high water can be 2-3 orders of magnitude. The mean annual flood is an important threshold: its level separates land flooded each year from land only affected by large floods. We lack adequate geospatial information on a global basis defining floodplains within the mean annual flood limit and the higher lands still subject to significant risk (e.g. with exceedance probability greater than 3.3%; the 30 yr floodplain). This lack of knowledge concerning changing surface water affects many disciplines and remote sensing data sets, where, quite commonly, a static water 'mask' is employed to separate water from land. For example, inland bio-geochemical cycling of C and N is affected by flooding, but floodplain areas are not well constrained. Measurements and computer models of flood inundation over large areas have been difficult to produce because of a scarcity of observations in compatible formats and a lack of the detailed boundary conditions, in particular floodplain topography, required to run hydrodynamic models. However, the available data now allow such work, and the computational techniques needed to ingest such information are ready for development. Optical and SAR sensing are providing a near-global record of floodplain inundation, and passive microwave radiometry is producing a calibrated record of flood-associated discharge values, 1998-present.
Also, global topographic data are of increasingly fine resolution, and techniques have been developed to facilitate their incorporation into modeling. Several of us have already demonstrated the new capability to accurately model and map floodplains on a continental scale using input discharges of various sizes and exceedance probabilities. Work is needed to accomplish global-scale products, wherein results are extended to all continents and downscaled to be locally accurate and useful. Floodplain mapping technologies and standards vary greatly among nations (many nations have neither): the planned effort will provide a global flood hazard infrastructure on which detailed local risk assessment can build. Our project brings together an experienced team of modeling, remote sensing, hydrology, and information technology scientists at JPL and the University of Colorado with the Google Earth Engine team to implement and disseminate a Global Floodplains and Flood Risk digital map product. This project addresses major priorities listed in the AIST program: with Google, we would identify, develop, and demonstrate advanced information system technologies that increase the accessibility and utility of NASA science data and enable new information products. The work will address the Core Topic 'Data-Centric Technologies', including 'Technologies that provide opportunities for more efficient interoperations with observations data systems, such as high end computing and modeling systems; and Capabilities that advance integrated Earth science missions by enabling discovery and access to Service Oriented Architecture'. It will also address the Special Subtopic 'Technology Enhancements for Applied Sciences Applications' in regard to natural disasters, and contribute to the GEOSS architecture for the use of remote sensing products in disaster management and risk assessment.
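The return-period thresholds used above (e.g. the 30-yr floodplain with its 3.3% annual exceedance probability) follow from a simple relationship that can be made concrete; the function names here are illustrative, not part of the proposed product:

```python
# Annual exceedance probability (AEP) for a T-year flood, and the chance of
# experiencing at least one such flood over a planning horizon of n years,
# assuming independent years.
def annual_exceedance_probability(return_period_years: float) -> float:
    """AEP = 1 / T; a 30-yr flood has about a 3.3% chance in any year."""
    return 1.0 / return_period_years

def prob_at_least_one(return_period_years: float, horizon_years: int) -> float:
    """P(>=1 exceedance in n years) = 1 - (1 - AEP)^n."""
    p = annual_exceedance_probability(return_period_years)
    return 1.0 - (1.0 - p) ** horizon_years
```

Note that over a 30-year horizon the chance of at least one 30-yr flood is about 64%, not 100%, which is why floodplains beyond the mean annual flood limit still carry significant risk.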
- API data.nasa.gov | Last Updated 2018-07-19T07:58:15.000Z
<p>The objective of the Advanced Exploration Systems (AES) Logistics Reduction (LR) project's Universal Waste Management System (UWMS) task is to develop a compact toilet system that can be used across multiple future crewed vehicles and habitats. The UWMS effort will result in a toilet with reduced mass and volume that provides increased crew comfort and performance. A key feature of the UWMS is the urine pretreatment dose pump/pretreatment quality indication device, which enables water recovery from urine. The UWMS core hardware is primarily funded by the AES LR project, with cost sharing from the Orion (Multi-Purpose Crew Vehicle (MPCV)) Program for a second flight unit, and the ISS Program for integration hardware for flying the first UWMS unit on ISS. </p><p>Procurement of the UWMS started late in 2015. UWMS integration and ISS planning began in 2015 and will continue through 2019 in collaboration with the ISS Payloads Office and the Orion Program. </p><p>The UWMS will be flown on the ISS as a technology demonstration payload in FY19. The ISS UWMS demonstration will validate the hygienic collection of urine and feces. Effective collection is critical to maintaining crew health and hygiene in long-duration habitats. Additionally, the ISS technology demonstration will demonstrate the ability to pretreat urine and deliver it to the ISS urine processor. The pretreat pump and pretreat quality sensor are important components of a future long-term space habitat water recovery system. A second UWMS unit will be developed to fly on the MPCV Exploration Mission-2 (EM-2) mission.</p><p>The waste management team collaborates on a Phase 2 SBIR on torrefaction of feces to assess its feasibility for space missions. Additionally, the UWMS team collaborates with the MPCV's Launch, Entry, and Abort (LEA) space suit team on contingency urine and fecal waste collection systems. The LEA suit waste collection system is used if the MPCV loses cabin pressure, but it may be possible to use the LEA contingency system as a backup for waste collection if the UWMS becomes inoperable.</p>
- API data.nasa.gov | Last Updated 2018-07-19T07:46:32.000Z
NASA has invested significant effort in the past decade in developing and maturing technologies that enable efficient and effective use of Next-generation (NextGen) Vertical Lift (VL) systems for a broad class of missions and operations. One of the key barriers to the widespread use of VL vehicles within the National Airspace is the cost of the maintenance required to keep them safe and reliable. Qualtech Systems, Inc. (QSI), in collaboration with Lockheed Martin - Mission Systems and Training (LM-MST), seeks to address these maintenance challenges by fielding a predictive Condition Based Maintenance Plus (CBM+) solution leveraging a diagnostic reasoner, TEAMS-RDS (Testability Engineering And Maintenance System Remote Diagnosis Server), and prognostic algorithms. CBM+ involves inferring, tracking, and forecasting system degradation based on state awareness acquired from monitored data through fault detection, isolation, identification, diagnosis, and prognosis techniques, and proactively planning maintenance actions to improve system availability and safety. QSI-LM's CBM+ solution will furnish the ability to keep the vehicle health status continually ahead of advancing failure accumulation through a predictive maintenance strategy geared towards replacement-while-in-operation before the ensuing failures render the VL vehicle inoperable. Diagnosis will focus on current health state identification through detection, isolation, root cause analysis, and identification of faults that have already occurred, while prognosis will leverage the current health state identification to forecast performance degradation, incipient component failures, and the probability density (or moments) of remaining useful life (RUL), Time to Maintenance (TTM), or Time to Failure (TTF). It is anticipated that the CBM+ solution will leverage the currently existing communication capabilities between the aircraft, the pilot, and ground-support personnel in a seamless and automated manner.
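The prognosis step described above, forecasting degradation and estimating remaining useful life, can be illustrated with a deliberately simple sketch. This is a generic trend-extrapolation example, not QSI's TEAMS-RDS algorithm, and the threshold and signal are hypothetical:

```python
import numpy as np

def estimate_rul(times, degradation, failure_threshold):
    """Fit a linear trend to a monitored degradation signal and extrapolate
    to a failure threshold; return the estimated time remaining (RUL)
    measured from the last observation."""
    slope, intercept = np.polyfit(times, degradation, 1)
    if slope <= 0:
        return float("inf")          # no measurable degradation trend
    t_fail = (failure_threshold - intercept) / slope
    return max(t_fail - times[-1], 0.0)
```

Real prognostic algorithms replace the linear fit with physics-based or learned degradation models and report a probability density over RUL rather than a point estimate, but the forecast-to-threshold structure is the same.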
- API data.nasa.gov | Last Updated 2018-07-19T04:57:07.000Z
This data set contains the data from the Galileo dust detector system (GDDS) from start of mission through the end of mission. Included are the dust impact data, noise data, laboratory calibration data, and location and orientation of the spacecraft and instrument.
- API data.nasa.gov | Last Updated 2018-09-07T17:42:46.000Z
There is a vast amount of SAR data that is challenging for scientists to use. We propose a variety of technologies in SAR processing that will accelerate the processing and the use of the science products. Specifically, we will: 1) Develop methods of computational acceleration by exploiting back projection methods on cloud-enabled GPU platforms to directly compute focused imagery in UTM coordinates (the Landsat grid). This will deliver SAR data to users as user-ready products, in a form that is most familiar to them from optical sensors and which has never been done before; the need for specialized processing has been a major obstacle to scientists' adoption of radar data. Once formed, the data can be accessed on standard GIS platforms. We could greatly reduce the processing complexity for users so they can concentrate on the science, and bring the products seamlessly into the 21st-century tools that are rapidly evolving to handle the developing data explosion; 2) Develop python-based framework technologies at the user interface that support a more natural way for scientists to specify products and actions, thereby accelerating their ability to generate science results; 3) Extend the ESTO-funded InSAR Scientific Computing Environment framework to uniformly treat polarimetric and interferometric time-series such as those that will be created by the NISAR mission using serialized product-based workflow techniques. There are several key challenges that need to be addressed in parallel: 1) speed and efficiency in handling very large multi-terabyte time-series imagery data files, which requires innovations in multi-scale (GPU, node, cluster, cloud) workflow control; 2) framework technologies that can support the varied algorithms that these data enable, from SAR focusing, interferometry, polarimetry, and interferometric polarimetry to time-series processing; and 3) framework technologies that can support heterogeneous, multi-sensor data types (point clouds and rasters) in time and space.
NASA's upcoming radar mission, NISAR, will benefit from this technology after its planned launch in 2021, but first the vast archives of international missions, such as the Sentinel-1 A/B data at the Alaska Satellite Facility, can be exploited more fully.
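The back projection focusing named in item 1) above has a simple core that can be sketched directly; this toy example (our own construction, far from a production GPU pipeline) sums each platform position's range-compressed echo at the round-trip delay to every output pixel, so the image can be formed on any map grid:

```python
import numpy as np

C = 3e8  # speed of light, m/s

def backproject(echoes, t_axis, platform_xy, grid_x, grid_y):
    """Time-domain backprojection onto a user-chosen ground grid.

    echoes      : (n_pulses, n_samples) range-compressed data
    t_axis      : fast-time sample times in seconds
    platform_xy : (n_pulses, 2) platform positions, same units as the grid
    """
    image = np.zeros((len(grid_y), len(grid_x)))
    gx, gy = np.meshgrid(grid_x, grid_y)
    for p, (px, py) in enumerate(platform_xy):
        rng = np.hypot(gx - px, gy - py)         # range to each pixel
        delay = 2.0 * rng / C                     # round-trip travel time
        # coherent sum: sample this pulse's echo at each pixel's delay
        image += np.interp(delay, t_axis, echoes[p], left=0.0, right=0.0)
    return image
```

Because each pixel is computed independently from geometry alone, the same loop maps naturally onto GPU threads and onto arbitrary output grids such as UTM tiles, which is what makes backprojection attractive for user-ready products.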
- API data.nasa.gov | Last Updated 2018-07-19T13:10:14.000Z
We propose to design, model, build, and test a novel flash cracking reactor to convert plastic waste, and potentially other unconventional hydrocarbon feedstocks, into tunable molecular-weight fuels. The innovative reactor technology "flashes off" desired hydrocarbon products as they form, thus preventing the over-cracking of the polymers into more volatile hydrocarbons. This leads to improved selectivity for low vapor-pressure hydrocarbons, which are easier to store as fuel in large quantities at low pressures, as well as tunable molecular-weight products for multiple applications. Our design approach in Phase I will combine heat/mass transfer modeling with pyrolytic kinetics modeling for PE and PP, which will serve as model systems for waste plastic pyrolysis. We will first demonstrate, using our pyrolytic model, that the hydrocarbon product distribution can be modified and tailored by varying the reactor and condenser temperatures, nitrogen gas flow rate, and system pressure. We will also build and test the reactor system based on our model results. Controlling the product distribution of a flash cracking reactor while minimizing parasitic losses will be the primary challenge during the Phase I effort.
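The pyrolytic kinetics modeling mentioned above typically starts from rate laws like the one sketched here. This is a toy first-order model with assumed Arrhenius parameters, not the team's PE/PP kinetics; it only illustrates how strongly reactor temperature controls conversion, and hence why temperature is a tuning knob for the product distribution:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def arrhenius(T, A=1e13, Ea=2.0e5):
    """Rate constant k(T) = A exp(-Ea / (R T)); A and Ea are assumed values."""
    return A * math.exp(-Ea / (R * T))

def conversion(T, t_end, dt=1e-3):
    """Explicit-Euler integration of first-order cracking kinetics
    dX/dt = k(T) (1 - X), starting from zero conversion X."""
    X, t, k = 0.0, 0.0, arrhenius(T)
    while t < t_end:
        X += dt * k * (1.0 - X)
        t += dt
    return min(X, 1.0)
```

Even this crude model shows the exponential sensitivity to temperature that makes over-cracking a risk: a 50 K increase can push conversion from partial to nearly complete, which motivates flashing products out of the hot zone as they form.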
- API data.nasa.gov | Last Updated 2018-07-19T08:28:00.000Z
Robots need to know their location to map their surroundings, but without global positioning data they need a map to identify their surroundings and estimate their location. Simultaneous localization and mapping (SLAM) solves these dual problems at once. SLAM does not depend on any kind of infrastructure and is thus a promising localization technology for NASA planetary missions and for many terrestrial applications as well. However, state-of-the-art SLAM depends on easily recognizable landmarks in the robot's environment, which are lacking on barren planetary surfaces. Our work will develop a technology we call MeshSLAM, which constructs robust landmarks from associations of weak features extracted from terrain. Our test results will also show that MeshSLAM applies to all environments in which NASA's rovers could someday operate: dunes, rocky plains, overhangs, cliff faces, and underground structures such as lava tubes. Another limitation of SLAM for planetary missions is its significant data-association problems. As a robot travels it must infer its motion from the sensor data it collects, which invariably suffers from drift due to random error. To correct drift, SLAM must recognize when the robot has returned to a previously visited place, which requires searching over a great deal of previously sensed data. Computation on such a large amount of memory may be infeasible on space-relevant hardware. MeshSLAM eases these requirements. It employs topology-based map segmentation, which limits the scope of a search. Furthermore, a faster, multi-resolution search is performed over the topological graph of observations. Mesh Robotics LLC and Carnegie Mellon University have formed a partnership to commercially develop MeshSLAM. MeshSLAM technology will be available via open source, to ease its adoption by NASA. In Phase 1 of our project we will show the feasibility of MeshSLAM for NASA and commercial applications through a series of focused technical demonstrations.
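The drift-correction idea described above, recognizing a previously visited place and using that loop closure to repair accumulated odometry error, can be sketched in its simplest form. This is a generic SLAM illustration (linear error distribution over a single loop), not the MeshSLAM algorithm:

```python
import numpy as np

def correct_loop(poses):
    """Correct dead-reckoned positions after a loop closure.

    poses: (n, 2) xy positions of a trajectory that should start and end at
    the same place. The mismatch between the last and first pose is the
    drift revealed by the loop closure; it is distributed linearly along
    the trajectory, a crude stand-in for full pose-graph optimization."""
    poses = np.asarray(poses, dtype=float)
    drift = poses[-1] - poses[0]                  # error exposed at closure
    weights = np.linspace(0.0, 1.0, len(poses))   # 0 at start, 1 at end
    return poses - weights[:, None] * drift
```

Full SLAM systems instead optimize a pose graph over many loop closures; the segmentation and multi-resolution search described above are what make finding those closures tractable on space-relevant hardware.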