Monolithic Power Integrated Circuits for Merging Power Electronics, Management, and Distribution, Phase I | data.nasa.gov | Last Updated 2018-07-19T08:23:32.000Z
APIQ Semiconductor proposes the development of a scalable, wide bandgap (WBG) monolithic power integrated circuit (MPIC) technology for power electronic conversion, management, and distribution. The proposed WBG microelectronics are to be based on low-defect, homogeneous gallium nitride (GaN) materials grown on native GaN substrates. The technology to be developed will replace silicon power switches and drivers in power electronics systems to yield high-efficiency, high-density, reliable module-based systems. The proposal includes devices for power switching at 1200 V and above, together with digital integration. Devices will be evaluated for high-temperature operation and heavy-ion radiation hardness, with performance improvements over competing technologies expected from low material defect densities and carefully managed electric field profiles.
- API data.nasa.gov | Last Updated 2018-07-20T07:17:16.000Z
The investigation of coating friction as a function of time is important for monitoring ball bearing health. Despite the importance of the subject matter, there is a crucial lack of information in the literature about coating life and friction force in ball bearings as coating wear progressively increases. Here we propose to develop a strategic space vehicle health monitoring system that will identify potential and/or imminent lubrication problems, analyze these parameters in real time, and provide direct input so that these problems are mitigated prior to failure. We will set up a laboratory experiment with a universal microtribometer and acoustic emission sensors measuring the signals associated with wear and the changes that tend to occur as a function of time. Friction force and acoustic signal will be measured with respect to the bearing condition. To capture the dynamic nature of friction evolution, we propose to extract temporal transient features from the sensing data and develop Hidden Markov Models with four distinct states associated with four operating conditions of the ball bearing. Our system uniquely combines both physics-based and stochastic models for online diagnosis.
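As a concrete illustration of the four-state Hidden Markov Model idea, here is a minimal sketch in Python. All numbers (transition and emission probabilities, the quantized feature symbols) are hypothetical placeholders, not measured bearing data; the proposal's actual models would be trained on the microtribometer and acoustic emission signals.

```python
import numpy as np

# States correspond to the four bearing operating conditions.
states = ["healthy", "light_wear", "heavy_wear", "failure"]

# Left-to-right transition matrix: wear only progresses (assumed values).
A = np.array([
    [0.97, 0.03, 0.00, 0.00],
    [0.00, 0.95, 0.05, 0.00],
    [0.00, 0.00, 0.90, 0.10],
    [0.00, 0.00, 0.00, 1.00],
])

# Emissions over three quantized friction/acoustic feature symbols
# (0 = low, 1 = medium, 2 = high energy), also assumed for illustration.
B = np.array([
    [0.80, 0.15, 0.05],
    [0.30, 0.60, 0.10],
    [0.10, 0.40, 0.50],
    [0.05, 0.15, 0.80],
])

pi = np.array([1.0, 0.0, 0.0, 0.0])  # the bearing starts healthy

def forward_posterior(obs):
    """Scaled forward pass; returns the state posterior after the last obs."""
    alpha = pi * B[:, obs[0]]
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        alpha /= alpha.sum()
    return alpha

# A feature sequence drifting from low- to high-energy signals wear.
obs = [0, 0, 1, 1, 2, 2, 2]
posterior = forward_posterior(obs)
diagnosis = states[int(np.argmax(posterior))]
print(diagnosis)  # heavy_wear
```

The left-to-right transition structure encodes the physical constraint that coating wear cannot reverse, and the scaled forward pass yields a posterior over bearing states after each new observation, which is the online diagnosis.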
- API data.nasa.gov | Last Updated 2018-07-19T07:32:14.000Z
A trajectory design tool is sought that leverages the chaos and nonlinear dynamics present in multi-body gravitational fields to design ultra-low-energy transfer trajectories, with applications to continuously thrusting spacecraft. Specifically, invariant manifolds associated with libration points will be leveraged in an algorithm to generate initial solutions, which will then be fed into higher-fidelity optimization tools. The tool will be used in a case study to design an interplanetary transfer trajectory for a CubeSat using solar electric propulsion. By combining the inherent efficiency of solar electric propulsion with the fuel savings available through invariant manifold trajectory design, the required fuel is expected to be cut significantly compared to spacecraft using chemical rockets and Hohmann transfers. The research will contribute to the proliferation of new in-space propulsion systems by providing a simulation-based design tool specifically targeted at such systems. Thus the research answers the call of TABS section 2.4, In-Space Propulsion Supporting Technology, and section 11.18, Simulation Based Systems Engineering. Furthermore, as the algorithm is computationally improved, the trajectory software may be implemented onboard spacecraft, enabling online trajectory design and optimization; the research therefore meets the call of TABS section 4.5, Autonomy. Finally, ultra-low-energy trajectories can be used to cheaply send scouting spacecraft on precursor missions. CubeSat missions enabled by the proposed research could serve to study and map human exploration destinations prior to human arrival. Thus the proposed research meets the calls for Destination Reconnaissance and Mapping (section 7.1.1) as well as Modeling, Simulations and Destination Characterization (section 7.6.1).
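For orientation, invariant-manifold design starts from the libration points of the circular restricted three-body problem (CR3BP). The sketch below (my own illustration, not the proposed tool) locates the Earth-Moon L1 point by bisection on the collinear equilibrium condition; the mass parameter is approximate.

```python
# Earth-Moon mass parameter (Moon mass fraction), approximate.
mu = 0.01215

def dUdx(x):
    """x-derivative of the CR3BP effective potential on the x-axis,
    for x between the primaries (Earth at -mu, Moon at 1 - mu)."""
    r1 = x + mu          # distance to Earth
    r2 = (1.0 - mu) - x  # distance to Moon
    return x - (1.0 - mu) / r1**2 + mu / r2**2

# dUdx changes sign at L1; bracket it between the primaries and bisect.
lo, hi = 0.5, 0.98
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if dUdx(lo) * dUdx(mid) <= 0.0:
        hi = mid
    else:
        lo = mid
x_L1 = 0.5 * (lo + hi)
print(round(x_L1, 4))  # 0.8369, in Earth-Moon distance units from the barycenter
```

Stable and unstable manifold trajectories are then generated by perturbing periodic orbits about such equilibria, which is what supplies the low-energy initial guesses described above.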
- API data.nasa.gov | Last Updated 2018-08-02T15:26:18.000Z
<p>The primary objective of this activity is to develop, design, and test (DD&T) the QUAD-core siTARA (QUATARA) computer to distribute computationally intensive processes such as communication, sensors, attitude determination, attitude control, cameras, robotic manipulators, and science payloads. An example of the current state of the art for a COTS CubeSat flight computer is a 16-bit 80 MHz Microchip dsPIC33 microcontroller capable of managing the satellite attitude determination and control system, communication system, power, and science payloads. Adding more capability to these COTS flight computers required the development, under a previous CIF proposal, of the Modular Attitude Determination System (MADS) board. MADS lessened the I/O load on the flight computer so it could focus on higher-priority tasks such as managing a Real-Time Operating System (RTOS) or carrying out an attitude control algorithm. The MADS board utilized an 80 MHz Texas Instruments ARM Cortex-M4 Stellaris microcontroller to execute the attitude determination algorithm independently of the dsPIC33 flight computer. Once the MADS board processes the data, the dsPIC33 receives the estimated states and determines the desired attitude control.</p><p>The addition of cameras, proximity sensors, robotic manipulators, thruster systems, complex science payloads, and video guidance systems would overwhelm current CubeSat flight computers. To expand the capabilities of CubeSats, the QUATARA architecture enhances data handling and computer processing by replacing the 16-bit 80 MHz microcontrollers with four 64-bit 1 GHz microprocessors. 
The QUATARA allows tasks to be processed at a faster rate, not only because of the difference in clock speed between the platforms but also because the four individual microprocessors can run these tasks independently, without serializing code execution as a single microcontroller must.</p><p>The QUATARA computer aims to be fault-tolerant by means of a software voting scheme that guards against Single Event Effects (SEE) such as Single Event Upsets (SEU). Each ‘node’ (a Gumstix Computer-On-Module (COM)) of the QUATARA computer will be connected to its own set of sensors and actuators. These individual nodes will collect their respective data and share it among themselves over a data bus (such as RS-485). Once each node has the data from all of the other nodes, it will process it and produce a result. This result can then be used to determine whether a node should be considered ‘failed’ and disabled (either by ignoring future data received from that node or by shutting it off completely). If a node is lost, a support node is available to be switched in for the failed node. This support node will focus on low-priority tasks (such as housekeeping) when it is not required as a voting node. Synchronization between the nodes can be maintained by having a precise timing source on each of the processors (such as a ticking timer interrupt routine) that ticks at a set interval. This timing information will be passed between the nodes, and the tick rate of the interrupt routine will be modified as required to ensure that all of the nodes’ data remains in sync.</p>
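The voting scheme described above can be sketched in a few lines. This is a hypothetical in-process illustration with made-up node results; a flight implementation would compare results within a tolerance and exchange them over the RS-485 bus between physical nodes.

```python
from collections import Counter

def vote(results):
    """Majority vote over node results.

    results: dict mapping node id -> computed result.
    Returns (majority value, list of node ids that disagreed and
    should be treated as failed / excluded from future votes).
    """
    counts = Counter(results.values())
    majority_value, majority_count = counts.most_common(1)[0]
    if majority_count <= len(results) // 2:
        raise RuntimeError("no majority; cannot arbitrate")
    failed = [nid for nid, v in results.items() if v != majority_value]
    return majority_value, failed

# Four nodes: node C returns a corrupted result (e.g. after an SEU).
results = {"A": 42, "B": 42, "C": 17, "D": 42}
value, failed = vote(results)
print(value, failed)  # 42 ['C']
```

With four voting nodes, a single upset node is outvoted 3-to-1 and flagged, matching the design in which the support node can then be switched in for the failed one.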
- API data.nasa.gov | Last Updated 2018-07-19T07:26:31.000Z
<p>The Planetary Instrument Definition and Development Program (PIDDP) supports the advancement of spacecraft-based instrument technology that shows promise for use in scientific investigations on future planetary missions. The goal is to define and develop scientific instruments, or components of such instruments, to the point where the instruments may be proposed in response to future announcements of flight opportunity without additional extensive technology development.</p><p>Results of PIDDP have contributed to the development of flight hardware flown on, or selected for, many of NASA's planetary missions. The instrument technology selected through PIDDP addresses specific scientific objectives of likely future science missions. Instrument definition and development studies take place at several stages, including feasibility studies, conceptual design, laboratory breadboarding, brassboarding, and testing of critical components and complete instruments. PIDDP supports technology readiness levels (TRL) 1-6. For immature or particularly complex new instruments, proposers may initially choose to define or develop only the riskiest components. When the proposed effort is for a component only, the proposal describes one or more likely scenarios for possible follow-on instrument development. Scientific objectives of the instruments, proposed follow-on instruments, and future candidate missions are discussed in the proposal for each selected activity. It is the responsibility of the proposer to demonstrate how the proposed instruments address significant scientific questions relevant to stated NASA goals; it is not for NASA to attempt to infer this.</p>
- API data.nasa.gov | Last Updated 2018-07-19T15:33:46.000Z
The NASA Earth Exchange (NEX) Downscaled Climate Projections (NEX-DCP30) dataset comprises downscaled climate scenarios for the conterminous United States derived from the General Circulation Model (GCM) runs conducted under the Coupled Model Intercomparison Project Phase 5 (CMIP5) [Taylor et al. 2012], across the four greenhouse gas emissions scenarios known as Representative Concentration Pathways (RCPs) [Meinshausen et al. 2011] developed for the Fifth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC AR5). The dataset includes downscaled projections from 33 models, as well as ensemble statistics calculated for each RCP from all available model runs. The purpose of these datasets is to provide a set of high-resolution, bias-corrected climate change projections that can be used to evaluate climate change impacts on processes sensitive to finer-scale climate gradients and to the effects of local topography on climate conditions. Each of the climate projections includes monthly averaged maximum temperature, minimum temperature, and precipitation for the periods from 1950 through 2005 (Retrospective Run) and from 2006 through 2099 (Prospective Run).
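The ensemble statistics mentioned above reduce to per-grid-cell summaries across the model runs for a given RCP and month. A minimal sketch with synthetic numbers (a tiny random grid, not actual NEX-DCP30 data):

```python
import numpy as np

# Synthetic stand-in for one month of downscaled maximum temperature
# from the 33 models, on a tiny 4 x 5 grid (deg C, made-up values).
rng = np.random.default_rng(0)
n_models, ny, nx = 33, 4, 5
tasmax = rng.normal(30.0, 1.5, size=(n_models, ny, nx))

# Ensemble statistics: reduce across the model axis per grid cell.
ens_mean = tasmax.mean(axis=0)                        # multi-model mean
ens_q25, ens_q75 = np.percentile(tasmax, [25, 75], axis=0)

print(ens_mean.shape)  # (4, 5)
```

The same axis-0 reduction, applied per month and per RCP over the full 30-arcsecond grid, yields the published ensemble products.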
- API data.nasa.gov | Last Updated 2018-07-19T09:16:33.000Z
Munro offers an innovative, intelligent, fully integrated hardware and software cockpit system for handling many General Aviation (GA) and UAV emergencies, so as to minimize NextGen ATM disruption while saving lives. This ADS-B-ER system will provide GA airplanes and UAVs with automated -ER trajectories to the nearest suitable airport, avoiding terrain/obstacles, hazardous weather, and restricted airspace.
- API data.nasa.gov | Last Updated 2018-07-19T15:57:23.000Z
JEM Engineering proved the technical feasibility of the FlexScan array (a very low-cost, highly efficient, wideband phased array antenna) in Phase I, and stands ready to develop it into a fully functional, flight-qualifiable prototype in Phase II. JEM developed an S-Band (2.0-2.3 GHz) antenna array design appropriate for the stratospheric balloon application through requirements definition, modeling, and performance predictions. The critical technology for this array is an electrically controlled Variable Delay Line (VDL), used to provide true time-delay for beamsteering. VDLs were designed, built, and tested, and shown to have excellent performance. The VDLs were tested over 2.4 million cycles without degradation, indicating good lifetime, especially for the balloon application. A 4-port linear beamformer was built and used to validate the beamformer concept. The objective of the proposed 24-month Phase II effort is to develop, prototype, and demonstrate a flight-qualifiable FlexScan phased array that achieves the bandwidth, antenna gain, and scan range required for a balloon-borne TDRSS data link in S-band, while meeting environmental requirements. Upon completion of Phase II, the FlexScan array will be ready to commercialize for the balloon-borne application, with other NASA and non-NASA commercial applications soon to follow.
Global Flood Risk From Advanced Modeling and Remote Sensing in Collaboration With Google Earth Engine | data.nasa.gov | Last Updated 2018-07-19T07:18:46.000Z
As predictive accuracy of the climate response to greenhouse emissions improves, measurements of sea level rise are being coupled with modeling to better understand coastal vulnerability to flooding. Predictions of rising storm rainfall intensity and larger tropical storms also imply increased inland flooding, and many studies conclude this is already occurring in some regions. Most rivers experience some flooding each year: the seasonal discharge variation from low to high water can span 2-3 orders of magnitude. The mean annual flood is an important threshold: its level separates land flooded every year from land affected only by large floods. We lack adequate geospatial information on a global basis defining floodplains within the mean annual flood limit and the higher lands still subject to significant risk (e.g., with an annual exceedance probability greater than 3.3%, the 30-year floodplain). This lack of knowledge concerning changing surface water affects many disciplines and remote sensing data sets, where, quite commonly, a static water 'mask' is employed to separate water from land. For example, inland biogeochemical cycling of C and N is affected by flooding, but floodplain areas are not well constrained. Measurements and computer models of flood inundation over large areas have been difficult to combine because of a scarcity of observations in compatible formats and a lack of the detailed boundary conditions, in particular floodplain topography, required to run hydrodynamic models. However, the available data now allow such work, and the computational techniques needed to ingest such information are ready for development. Optical and SAR sensing are providing a near-global record of floodplain inundation, and passive microwave radiometry is producing a calibrated record of flood-associated discharge values, 1998-present. 
Also, global topographic data are of increasingly fine resolution, and techniques have been developed to facilitate their incorporation into modeling. Several of us have already demonstrated the new capability to accurately model and map floodplains at continental scale using input discharges of various sizes and exceedance probabilities. Work is needed to produce global-scale products, wherein results are extended to all continents and downscaled to be locally accurate and useful. Floodplain mapping technologies and standards vary greatly among nations (many nations have neither): the planned effort will provide a global flood hazard infrastructure on which detailed local risk assessment can build. Our project brings together an experienced team of modeling, remote sensing, hydrology, and information technology scientists at JPL and the University of Colorado with the Google Earth Engine team to implement and disseminate a Global Floodplains and Flood Risk digital map product. This project addresses major priorities listed in the AIST program: with Google, we would identify, develop, and demonstrate advanced information system technologies that increase the accessibility and utility of NASA science data and enable new information products. The work will address the Core Topic 'Data-Centric Technologies', including 'Technologies that provide opportunities for more efficient interoperations with observations data systems, such as high end computing and modeling systems; and Capabilities that advance integrated Earth science missions by enabling discovery and access to Service Oriented Architecture'. It will also address the Special Subtopic 'Technology Enhancements for Applied Sciences Applications' in regard to natural disasters, and contribute to the GEOSS architecture for the use of remote sensing products in disaster management and risk assessment.
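The exceedance-probability arithmetic behind the 30-year floodplain threshold can be sketched with synthetic annual peak discharges and the standard Weibull plotting position (an illustration only, not the project's hydrodynamic modeling):

```python
import numpy as np

# Ten synthetic annual-maximum discharges (m^3/s, made-up values).
annual_peaks = np.array([820.0, 640.0, 1510.0, 930.0, 700.0,
                         1180.0, 560.0, 2050.0, 880.0, 760.0])

# Weibull plotting position: rank the peaks largest-first; the
# empirical annual exceedance probability is P = rank / (n + 1),
# and the return period is T = 1 / P.
peaks_sorted = np.sort(annual_peaks)[::-1]
rank = np.arange(1, len(peaks_sorted) + 1)
exceed_prob = rank / (len(peaks_sorted) + 1)
return_period = 1.0 / exceed_prob

# The 30-year floodplain boundary corresponds to P = 1/30.
p30 = 1.0 / 30.0
print(round(p30 * 100, 1))  # 3.3 (% annual exceedance probability)
```

In the project, the discharge associated with a chosen exceedance probability drives the hydrodynamic inundation model, and the resulting flood extent is the mapped floodplain for that return period.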
- API data.nasa.gov | Last Updated 2018-07-19T12:43:08.000Z
Space-based imaging sensors are important to NASA's mission, both for performing scientific measurements and for producing literature and documentary cinema. The recent proliferation of high-definition capture devices and displays (HDTV) provides the general public with first-hand human experiences hundreds of miles above sea level in brilliant detail. The recent IMAX film "Hubble," which features one of the final space shuttle missions to repair the orbital telescope, is a prime example. The core of current space-based video capture devices consists of digital imaging sensors. Unfortunately, the harsh conditions of space limit the lifespan of all imaging sensors, in addition to other electronics. Consequently, NASA is seeking innovative technologies for space-based applications to extend the operational life of these systems to three years or more. In this SBIR project, we propose to investigate robust image reconstruction based on novel signal processing techniques in the vein of compressed sensing (CS) to mitigate pixel damage to the point that it is imperceptible to the human eye. Specifically, this proposal responds to the solicitation for radiation-hardened programmable encoding technology as an identified mid-term NASA solution. CS is a recently introduced framework that departs from the traditional data acquisition paradigm: it demonstrates that a sparse, or compressible, signal can be acquired by a low-rate acquisition process that projects the signal onto a small set of vectors incoherent with the sparsity basis. This approach is divided into encoder and decoder stages. We propose performing the encoding in-line with acquisition using a low-SWaP, radiation-tolerant FPGA; the robust reconstruction will occur back on Earth, where high-performance GPU-accelerated workstations can be used. A benefit of our solution is that it does not require a modification to the original imaging system.
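A toy sketch of the encoder/decoder split described above, with a synthetic sparse signal: random projection stands in for the on-board FPGA encoder, and orthogonal matching pursuit (a standard greedy CS decoder, used here purely as an illustration of ground-side reconstruction, not the proposers' actual algorithm) stands in for the decoding workstation.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 64, 32, 3  # signal length, measurements (m < n), sparsity

# Synthetic k-sparse signal standing in for a compressible image.
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.normal(0.0, 1.0, size=k)

# Encoder stage (on-board): low-rate acquisition by random projection.
Phi = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, n))
y = Phi @ x

def omp(Phi, y, k):
    """Greedy decoding: repeatedly pick the column most correlated
    with the residual, then re-fit the selected columns by least squares."""
    residual, idx = y.copy(), []
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(Phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, idx], y, rcond=None)
        residual = y - Phi[:, idx] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[idx] = coef
    return x_hat

# Decoder stage (on Earth): reconstruct from the m < n measurements.
x_hat = omp(Phi, y, k)
print(float(np.linalg.norm(x_hat - x)))
```

Because the encoder is only a matrix-vector product, it maps naturally onto a low-SWaP FPGA, while the comparatively expensive decoding runs on the ground, which is the asymmetry the proposal exploits.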