- API data.nasa.gov | Last Updated 2018-07-19T08:08:25.000Z
<p>Collision avoidance for unmanned aerial systems (UAS) traveling at high relative speeds is a challenging task. It requires both the detection of a possible collision and the deployment of an appropriate avoidance maneuver, all within a few seconds or less. NASA Langley and Boston University are engaged in a collaborative effort to design neuromorphic optic flow algorithms for collision avoidance and to embed these algorithms in small, low-weight, low-power customized hardware solutions for UAS.</p><p>Using biologically inspired neuromorphic optic flow algorithms is a novel approach to collision avoidance for UAS. Traditional computer vision algorithms estimate optic flow by solving systems of nonlinear partial differential equations, which is computationally expensive. Neuromorphic algorithms instead draw on lessons learned from biology to solve these problems more efficiently. An example is the fly's motion detector, which can be modeled by a system that uses a set of locally calculated, parallel spatio-temporal correlations over a set of velocities determined by the input sampling rates and flying speeds. Correlation results are interpreted as likelihoods of a given motion direction and speed. Obstacle detection and tracking stages can temporally and spatially integrate these likelihoods to increase the signal-to-noise ratio and, in turn, the detection rate. In addition to its computational efficiency, the proposed neuromorphic solution is more stable and noise tolerant than solving a nonlinear optimization problem. Even if single computational nodes are corrupted by functional or structural failures in the hardware, the performance of appropriately designed parallel, distributed neuromorphic algorithms degrades gracefully. Neuromorphic algorithms are commonly implemented in software running on general-purpose multicore/graphics processing unit systems. 
This approach, though flexible, carries significant overhead in power and performance and is not easily portable across platforms, which reduces its scope of applicability. In the second phase, we will port the neuromorphic algorithms to field programmable gate arrays (FPGAs) and application-specific integrated circuits (ASICs). This will allow us to meet the demanding requirements of UAS, such as fast processing, low weight, low power consumption, and robustness to hardware failure.</p>
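The correlation scheme the abstract attributes to the fly's motion detector can be sketched minimally as a Hassenstein-Reichardt elementary motion detector. The sketch below is illustrative only, not the project's actual algorithm: a one-frame delay stands in for the biological low-pass filter, and the stimulus is an invented drifting bar.

```python
import numpy as np

def reichardt_correlator(frames, delay=1):
    """Hassenstein-Reichardt elementary motion detector over a 1-D
    pixel array sampled in time. frames: 2-D array (time, space).
    Returns per-pair correlation responses; positive values indicate
    rightward motion, negative leftward."""
    # A delayed copy of the signal approximates the low-pass filter arm.
    delayed = np.roll(frames, delay, axis=0)
    delayed[:delay] = frames[0]                  # pad the start
    left, right = frames[:, :-1], frames[:, 1:]
    d_left, d_right = delayed[:, :-1], delayed[:, 1:]
    # Opponent subtraction of the two mirror-symmetric half-detectors
    # gives a direction-selective output.
    return d_left * right - left * d_right

# Illustrative stimulus: a bright bar drifting rightward across
# 8 pixels over 8 time steps.
t, x = np.arange(8), np.arange(8)
frames = (x[None, :] == t[:, None]).astype(float)
resp = reichardt_correlator(frames)
print(resp.sum() > 0)   # net positive response = rightward motion
```

Each unit multiplies a delayed signal from one photoreceptor with the undelayed signal from its neighbor; subtracting the mirror-image pair yields a direction-selective response that later stages can pool over space and time, as the passage describes.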
- API data.nasa.gov | Last Updated 2018-07-20T06:54:34.000Z
New wind tunnel flow quality test and analysis procedures have been developed and will be used to establish standardized turbulent flow quality measurement techniques and data reduction procedures for future flow quality studies in the National Transonic Facility (NTF) and other Aeronautics Test Program (ATP) facilities. To date, few measurements have been made of the characteristics of freestream turbulence in transonic wind tunnels, and details of the amplitude and spectra of freestream velocity and pressure fluctuations are lacking. Consequently, there is an urgent need for in-situ measurements to determine flow quality and the performance of turbulence and noise suppression devices. This information is required to accurately assess and characterize ground test facility performance. To meet these challenges, a unique research program is proposed to clarify and alleviate the aerodynamic problems associated with adverse wind tunnel flow quality. It combines innovative advances in database assessment and management with new approaches to turbulence instrumentation and analysis. Standardized turbulence measurement techniques and data analysis procedures will be established and used to document the flow quality in our major test facilities.
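As an illustration of the kind of data reduction such standardized procedures involve, the sketch below computes a turbulence intensity and a one-sided power spectral density from a synthetic velocity record. The sample rate, mean speed, and fluctuation level are invented for the example; this is not NTF data or the program's actual procedure.

```python
import numpy as np

# Hypothetical hot-wire velocity record: mean flow plus broadband
# fluctuations (illustrative values only).
rng = np.random.default_rng(0)
fs = 50_000.0                                   # sample rate, Hz
u = 230.0 + 2.0 * rng.standard_normal(2**14)    # velocity, m/s

u_mean = u.mean()
u_prime = u - u_mean
turbulence_intensity = u_prime.std() / u_mean   # Tu = u'_rms / U

# One-sided power spectral density of the fluctuations via the FFT.
spectrum = np.fft.rfft(u_prime)
psd = (np.abs(spectrum) ** 2) / (fs * len(u_prime))
psd[1:-1] *= 2.0                                # fold negative frequencies
freqs = np.fft.rfftfreq(len(u_prime), d=1.0 / fs)

print(f"Tu = {turbulence_intensity:.4%}")
```

By Parseval's theorem the PSD integrates back to the velocity variance, which is a simple self-check a standardized reduction procedure might include.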
- API data.nasa.gov | Last Updated 2018-07-19T07:04:29.000Z
This data set gives the best available values for ion densities, temperatures, and velocities near Neptune derived from data obtained by the Voyager 2 plasma experiment. All parameters are obtained by fitting the observed spectra (current as a function of energy) with Maxwellian plasma distributions, using a nonlinear least squares fitting routine to find the plasma parameters which, when coupled with the full instrument response, best simulate the data. The PLS instrument measures energy/charge, so composition is not uniquely determined, but it can be deduced in some cases from the separation of the observed current peaks in energy (assuming the plasma is co-moving). In the upstream solar wind, protons are fit to the M-long data, since high energy resolution is needed to obtain accurate plasma parameters. In the magnetosheath, the ion flux is so low that several L-long spectra (3-5) had to be averaged to raise the signal-to-noise ratio to a level at which the data could be reliably fit. These averaged spectra were fit using two proton Maxwellians with the same velocity. The values given in the upstream magnetosheath are the total density and the density-weighted temperature. In both the upstream solar wind and magnetosheath, full vector velocities, densities, and temperatures are derived for each fit component. In the magnetosphere, the spectra do not contain enough information to obtain full velocity vectors, so the flow is assumed to be purely azimuthal. In some cases the azimuthal velocity is a fit parameter; in others, rigid corotation is assumed. In the 'outer' magnetosphere (L>5), two distinct current peaks, corresponding to H+ and N+, appear in the spectra. In the inner magnetosphere the plasma is hot and the composition is ambiguous, although two superimposed Maxwellians are still required to fit the data. These spectra are fit using two compositions, one with H+ and N+ and the second with two H+ components. The N+ composition is preferred by the data provider. 
All fit values in the magnetosphere come with one-sigma errors. It should be noted that no attempt has been made to account for the spacecraft potential, which is probably about -10 V in this region and will affect the density and velocity values. In the outbound magnetosheath and solar wind, both moment and fit values are given for velocity, density, and thermal speed. The signal-to-noise ratio in the M-longs is very low, especially near the magnetopause, which can result in the analysis giving incorrect values. The L-long spectra have too low an energy resolution to permit accurate determination of parameters in many regions; temperature and non-radial velocity components may be inaccurate.
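The fitting procedure described above can be illustrated with a toy version: two co-moving Maxwellian components fit to a synthetic current-versus-energy spectrum by nonlinear least squares. The reduced-current model and all parameter values below are simplified stand-ins; the real analysis convolves the full PLS instrument response, which this sketch omits.

```python
import numpy as np
from scipy.optimize import curve_fit

def maxwellian_current(E, n, E0, w):
    """Idealized reduced current vs. energy/charge for one drifting
    Maxwellian: amplitude n, peak energy E0, thermal width w.
    (A stand-in for the full instrument-response model.)"""
    return n * np.exp(-((np.sqrt(E) - np.sqrt(E0)) / w) ** 2)

def two_species(E, n1, E1, w1, n2, E2, w2):
    # Two components sharing one bulk velocity; at equal speed the
    # peaks separate in energy by the mass ratio (N+/H+ ~ 14).
    return (maxwellian_current(E, n1, E1, w1)
            + maxwellian_current(E, n2, E2, w2))

# Synthetic noisy spectrum with "H+"-like and "N+"-like peaks.
E = np.linspace(10.0, 4000.0, 400)              # eV per charge
true = two_species(E, 5.0, 200.0, 3.0, 2.0, 2800.0, 4.0)
rng = np.random.default_rng(1)
data = true + 0.02 * rng.standard_normal(E.size)

# Nonlinear least squares recovers the peak energies from the data.
popt, _ = curve_fit(two_species, E, data,
                    p0=[4.0, 150.0, 2.0, 1.5, 2500.0, 3.0])
print(popt)
```

The recovered peak-energy ratio near 14 is how composition can be "deduced from the separation of the observed current peaks in energy" for a co-moving plasma.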
- API data.nasa.gov | Last Updated 2018-07-19T08:02:03.000Z
(1) Identify and evaluate CO2/CO separation technologies that are compatible with the high operating temperatures (700-850°C) of the Solid Oxide Electrolysis process. (2) Identify and evaluate CO2 acquisition technology options. (3) MARCO POLO Atmospheric Processing Module (APM): verify the operation of the CO2 pump and the associated storage system, complete setup and testing of the Sabatier subsystem, and operate it with the CO2 freezers to ready the APM for a potential analog demonstration with other components of MARCO POLO at KSC and/or JSC.
- API data.nasa.gov | Last Updated 2019-06-03T15:18:56.000Z
This data set provides organic soil layer characteristics, estimated carbon content, and soil depth measurements made at four black spruce stands in interior Alaska that had burned twice in the last 37-52 years (intermediate-interval fire events). The most recent fires occurred in 2004, 2005, and 2010. Measurements of soil depth, distance from the adventitious roots to the soil, and total organic matter are also included for unburned black spruce sites adjacent to the burned sites, dominated by live, intermediate-aged (~37-52 years) black spruce trees.
- API data.nasa.gov | Last Updated 2018-07-19T09:50:29.000Z
The primary objective of the Phase I investigation is to develop and demonstrate an innovative solution that can enable very high precision pointing accuracy (<0.08 degree nominal; <0.03 degree extended goal) at fast slew rates, providing part of an advanced Smallsat/CubeSat precision attitude determination and control system (PADCS) that can meet emerging, very stringent mission requirements. The Phase I program aims to design and fabricate initial prototype hardware, including power electronics and Reaction Wheel Assembly (RWA) modifications, to demonstrate the positional accuracy, power cost (peak and average power consumption), slew rates, and mass/volume cost of the new solution. A critical objective of Phase I will be to develop the decoupled control architecture for the new multi-stage Attitude Control System (ACS) controller, which will be modeled, simulated, and then converted to a hardware prototype for Phase I assessments. The goal is to integrate this prototype controller into a multi-stage ACS hardware emulation testbed and evaluate actual performance before the conclusion of the program.
- API data.nasa.gov | Last Updated 2018-07-19T09:42:19.000Z
Fault Management (FM) is one of the key components of system autonomy. To guarantee FM effectiveness and control cost, tools are required to automate fault-tree generation and updates based on design models specified in standardized design languages such as AADL. Accordingly, we propose a fault tree generation and augmentation environment (FTGA). Equipped with a fault class model and an FM method catalog, FTGA evaluates not only the failure behavior of the application under analysis but also FM's capability and adequacy for failure mitigation. Moreover, when an inadequacy in FM is revealed during fault tree generation or analysis, the fault tree can be augmented through FM method insertion, followed by a quantitative evaluation to validate FM effectiveness. Therefore, unlike traditional fault tree analysis, which plays a passive role in FM, the automated FTGA environment actively and explicitly influences system design and updates, enabling "fault-tree-in-the-loop" over a system's life cycle. Further, by separating its generic functions (which we collectively call the "shared package") from design-language-specific functions (which we collectively call the "interface package"), FTGA will be an extensible modeling environment. The anticipated results of the Phase I project are a preliminary prototype of FTGA and a demonstration for concept validation.
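To make the fault-tree and quantitative-evaluation ideas concrete, here is a minimal sketch of a fault tree with AND/OR gates and a top-event probability calculation under an independence assumption. The events, probabilities, and structure are invented for illustration; this is not the FTGA tool or its fault class model.

```python
from dataclasses import dataclass
from typing import List, Union

@dataclass
class BasicEvent:
    name: str
    prob: float            # failure probability of the basic event

@dataclass
class Gate:
    kind: str              # "AND" or "OR"
    children: List[Union["Gate", BasicEvent]]

def top_event_probability(node):
    """Evaluate the tree assuming statistically independent events."""
    if isinstance(node, BasicEvent):
        return node.prob
    ps = [top_event_probability(c) for c in node.children]
    p = 1.0
    if node.kind == "AND":            # all children must fail
        for q in ps:
            p *= q
        return p
    for q in ps:                      # OR: 1 - product of survivals
        p *= (1.0 - q)
    return 1.0 - p

# A hypothetical mitigation: a sensor fault is covered by a redundant
# unit (AND of two failures); that branch OR an unmitigated software
# fault fails the system.
sensor = Gate("AND", [BasicEvent("sensor_A", 1e-3),
                      BasicEvent("sensor_B", 1e-3)])
tree = Gate("OR", [sensor, BasicEvent("sw_fault", 1e-4)])
print(top_event_probability(tree))
```

Augmenting the tree with an FM method (here, the redundant sensor) and re-evaluating the top-event probability is the kind of quantitative effectiveness check the abstract describes.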
- API data.nasa.gov | Last Updated 2018-07-19T11:57:00.000Z
An inherently rugged Universal Oscillator (UO) is needed to enable a superior class of configurable communications for NASA applications. The requirements are a low phase noise RF output combined with ruggedness, reliability, small size, power efficiency, and frequency tunability. VIDA Products has developed technology that will ultimately enable an integrated-circuit YIG oscillator with high-Q resonators and low power consumption that meets these requirements. The high-Q YIG resonators are made possible by quantum electron spin precession and are essentially lossless. In general, a resonator is realized by coupling the RF magnetic fields of a YIG sphere to the oscillator circuit structure. A bias magnetic field applied to the sphere at a right angle to the coupling field vector sets the frequency of operation, which is a linear function of the field at exactly 2.8 MHz per gauss. The resonator's equivalent electrical circuit is composed of elements, unrealizable with fixed components, whose values vary with frequency such that the filter bandwidth does not change with tuned frequency. Thus the Q increases with frequency, since one definition of Q is the tuned frequency divided by the bandwidth. For oscillators using these resonators, the phase noise is excellent and remains so as the oscillation frequency increases. The Resonant Ring Oscillator topology is readily realizable in MMIC technology, reducing a YIG-based oscillator to a single IC able to produce external fields of the correct vectors and path losses. To make use of this phenomenon, a proprietary circuit employing leakage shielding and frequency locking will control the magnetic bias field and be integrated into the UO IC. The result will be a Universal Oscillator that can be produced to operate at any frequency between 3 and 30 GHz at a cost equivalent to current VCO technology but with 30 to 40 dBc improvement in phase noise performance. Completing the development of this technology now will save considerable resources.
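The two relations the passage leans on, the 2.8 MHz per gauss tuning law and Q as tuned frequency over bandwidth, are simple enough to work through directly. The bias-field values and the fixed bandwidth below are illustrative assumptions, not VIDA specifications.

```python
# Back-of-the-envelope YIG tuning arithmetic described above:
# resonant frequency scales linearly with bias field at 2.8 MHz/gauss,
# and with a roughly constant bandwidth the Q rises with frequency.
GYROMAGNETIC = 2.8e6              # Hz per gauss

def yig_frequency_hz(bias_gauss):
    return GYROMAGNETIC * bias_gauss

def resonator_q(freq_hz, bandwidth_hz):
    return freq_hz / bandwidth_hz

bw = 500e3                        # illustrative fixed 3-dB bandwidth, Hz
for b in (1_100, 3_600, 10_700):  # illustrative bias fields, gauss
    f = yig_frequency_hz(b)
    print(f"{b:>6} G -> {f/1e9:5.2f} GHz, Q ~ {resonator_q(f, bw):,.0f}")
```

The loop shows why the quoted 3-30 GHz range corresponds to bias fields on the order of one to several kilogauss, and why Q grows roughly tenfold across that range if the bandwidth truly stays fixed.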
- API data.nasa.gov | Last Updated 2019-06-03T15:19:07.000Z
There are two data files (tab-delimited .txt format) with this data set that provide estimates of above-ground biomass per county; county-level annual above-ground biomass growth, removals (harvest), and mortality of woody biomass per hectare; county-level total annual above-ground woody biomass production per hectare; forest area per county; mortality (%) in forests within each county; and total annual production and mortality per county. The data provide annual mean above-ground wood increments for temperate forests in 1,956 counties of the 28 eastern US states. The data are derived from forest inventory data from the 1960s to the 1990s that were collected from an extensive network of permanent inventory plots as part of the US Department of Agriculture Forest Service Forest Inventory and Analysis (FIA). Based on the analysis of the above-ground production data (Brown and Schroeder, 1999), above-ground production of woody biomass (APWB) for hardwood forests ranged from 0.6 to 28 Mg/ha/yr and averaged 5.2 Mg/ha/yr. For softwood forests, APWB ranged from 0.2 to 31 Mg/ha/yr and averaged 4.9 Mg/ha/yr. APWB was generally highest in southeastern and southern counties, mostly along an arc from southern Virginia to Louisiana and eastern Texas. No clear spatial pattern of mortality of woody biomass (MWB) existed, except for a distinct area of high mortality in South Carolina as a result of Hurricane Hugo in 1989. For hardwood forests, MWB ranged from 0 to 15 Mg/ha/yr and averaged 1.1 Mg/ha/yr. The average MWB for softwood forests was 0.6 Mg/ha/yr with a range of 0 to 10 Mg/ha/yr. The rate of above-ground MWB averaged <1%/yr for both hardwood and softwood forests. Revision Notes: Only the documentation for this data set has been modified. The data files have been checked for accuracy and are identical to those originally published in 2003.
- API data.nasa.gov | Last Updated 2018-07-19T22:52:31.000Z
Proteomic studies in microgravity are crucial to understanding the health effects of spaceflight on astronauts. Unfortunately, existing tools for measuring protein, antibody, and enzyme expression are limited to Earth-based laboratories due to their complexity and size. This proposal offers a novel technology that provides a palm-top platform suitable for real-time experiments on the Space Shuttle or International Space Station. The technology uses nanoelectronic transistors coupled to antibody bioprobes to provide a label-free "direct detection" system that is rapid and easy to use with minimal skill. The system is completely self-contained, including all reagents and waste products, and is operated from a PDA-style handheld computer. Phase I will demonstrate the detection concept, and Phase II will deliver prototype units for testing.