HiPACC Data Science Press Room Archive

The Data Science Press Room highlights computational and data science news in all fields *outside of astronomy* from the UC campuses and DOE laboratories comprising the UC-HiPACC consortium. The wording of the short summaries on this page is based on wording in the individual releases or on the summaries on the press release page of the original source. Images are also from the original sources except as stated. Press releases below appear in reverse chronological order (most recent first); they can also be displayed by UC campus or DOE lab by clicking on the desired venue at the bottom of the left-hand column. This page is the archive; click here for current data science press releases.

October 9, 2014 — Counting crows—and more

For the birds: one-click data-intensive science
Male Scarlet Tanager (Piranga olivacea), a vibrant songster of eastern hardwood forests. These long-distance migrants move all the way to South America for the winter. Credit: Kelly Colgan Azar
UCSB 10/9/2014—As with the proverbial canary in a coal mine, birds are often a strong indicator of environmental health. Over the past 40 years, many species have experienced their own environmental crisis due to habitat loss and climate change. To fully understand bird distribution relative to environment requires extensive data beyond those amassed by a single institution. Enter DataONE: the Data Observation Network for Earth, a collaboration of distributed organizations with data centers and science networks, including the Knowledge Network for Biocomplexity (KNB) administered by UC Santa Barbara’s National Center for Ecological Analysis and Synthesis (NCEAS). Funded in 2009 as one of the initial NSF DataNet projects, DataONE has enhanced the efficiency of synthetic research—research that synthesizes data from many sources—enabling scientists, policymakers and others to more easily address complex questions about the environment. In its second phase, DataONE will target goals that enable scientific innovation and discovery while massively increasing the scope, interoperability and accessibility of data.
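
For readers curious what programmatic access to such a federation looks like, here is a minimal Python sketch of a search against DataONE's coordinating-node Solr endpoint; the endpoint URL and index field names are assumptions based on DataONE's public documentation, not code from the release:

    # Minimal sketch: search DataONE's federated catalog for bird-related datasets.
    # Endpoint and field names (identifier, title, datasource) are assumptions
    # based on DataONE's public Solr schema.
    import requests

    resp = requests.get(
        "https://cn.dataone.org/cn/v2/query/solr/",
        params={
            "q": 'abstract:"bird" AND abstract:"migration"',
            "fl": "identifier,title,datasource",
            "rows": 10,
            "wt": "json",
        },
        timeout=30,
    )
    resp.raise_for_status()
    for doc in resp.json()["response"]["docs"]:
        print(doc["identifier"], "-", doc.get("title"))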

View UCSB Data Science Press Release

October 9, 2014 — UCLA receives $11 million grant to lead NIH Center of Excellence for Big Data Computing

New UCLA center for managing big biomed datasets
Peipei Ping, principal investigator for UCLA’s new Center of Excellence for Big Data Computing. Credit: UCLA
UCLA 10/9/2014—The National Institutes of Health (NIH) has awarded UCLA $11 million to form a Center of Excellence for Big Data Computing. The Center will develop new strategies for mining and understanding the mind-boggling surge in complex biomedical data sets. The grant to UCLA was part of an initial $32 million outlay for the NIH’s $656 million Big Data to Knowledge (BD2K) initiative. As one of 11 centers nationwide, UCLA will create analytic tools to address the daunting challenges facing researchers in accessing, standardizing and sharing scientific data to foster new discoveries in medicine. Investigators also will train the next generation of experts and develop data science approaches for use by scientists. A key focus for the UCLA center will be creating and testing cloud-based tools for integrating and analyzing data about protein markers linked to cardiovascular disease. The center’s findings will help shape guidelines for future data integration and analysis, and the management of data from electronic health records.

View UCLA Data Science Press Release

October 9, 2014 — UC Santa Cruz leads $11 million Center for Big Data in Translational Genomics

New UCSC center for managing big genomic datasets
Credit: Elena Zhukova
UCSC 10/9/2014—The National Institutes of Health (NIH) has awarded $11 million to UC Santa Cruz to create the technical infrastructure needed for the broad application of genomics in medicine and biomedical research. This grant from the National Human Genome Research Institute (NHGRI) funds the Center for Big Data in Translational Genomics, a multi-institutional partnership based at UC Santa Cruz and led by David Haussler, professor of biomolecular engineering and director of the UC Santa Cruz Genomics Institute. The Center’s overarching goal is to help the biomedical community use genomic information to better understand human health and disease. To do this, scientists must be able to share and analyze genomic datasets that are orders of magnitude larger than those that can be handled by the existing infrastructure. Advances in DNA sequencing technology have made it increasingly affordable to sequence a person's entire genome, but managing genomic and related data from millions of individuals is a daunting challenge. The Center for Big Data in Translational Genomics will develop new protocols and tools for genomic data and test them in four pilot projects.

View UCSC Data Science Press Release

October 7, 2014 — Bio researchers receive patent to fight superbugs

Computing how to kill superbugs
Lawrence Livermore National Laboratory scientists (front row left to right) Matt Coleman and Feliza Bourguet and (back row left to right) Brian Souza and Patrik D'haeseleer received a patent for developing a computational genomic technique to fight superbugs (antibiotic-resistant bacteria). Credit: Julie Russell/LLNL
LLNL 10/7/2014—Superbugs, or antibiotic-resistant bacteria, have been on the rise since antibiotics were first introduced 80 years ago, because antibiotics have been overprescribed and misused, allowing bacterial pathogens to develop resistance to them. As a result, superbugs sicken nearly 2 million Americans each year, of whom roughly 23,000 die. Lawrence Livermore National Laboratory (LLNL) scientists have now received a patent for producing antimicrobial compounds that degrade and destroy antibiotic-resistant bacteria by using a pathogen’s own genes against it. Their technique uses computational tools and genome sequencing to identify which genes inside a bacterium encode lytic proteins—enzymes that normally produce nicks in cell walls that allow cells to divide and multiply. In high concentrations, however, the enzymes rapidly degrade and rupture cell walls, and lytic proteins circumvent any defenses that bacteria have developed. The LLNL approach can be used to fight superbugs such as antibiotic-resistant E. coli, Salmonella, Campylobacter, methicillin-resistant Staphylococcus aureus (MRSA), Bacillus anthracis and many others.

View LLNL Data Science Press Release

October 6, 2014 — RCSB Protein Data Bank launches mobile application

Proteins on the go—for free
Credit: RCSB Protein Data Bank
SDSC 10/6/2014—The RCSB Protein Data Bank (PDB), which recently archived its 100,000th molecule structure, has introduced a free mobile application that enables the general public and expert researchers to quickly search and visualize the 3D shapes of proteins, nucleic acids, and molecular machines. “As the mobile web is starting to surpass desktop and laptop usage, scientists and educators are beginning to integrate mobile devices into their research and teaching,” said Peter Rose, a researcher with the San Diego Supercomputer Center (SDSC) at UC San Diego and Scientific Lead with the RCSB PDB. “In response, we have developed this application for iOS and Android mobile platforms to enable fast and convenient access to RCSB PDB data and services.” The goal was to produce an intuitive app with a simple search interface, quick browsing of search results, a view of basic data about a structure entry and its PubMed abstract, and high-performance molecular visualization. In addition, the app provides access to the RCSB PDB Molecule of the Month educational series, and can be used to store personal notes and annotations.
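
As an illustration of the kind of programmatic access that sits behind such an app, here is a minimal Python sketch against the RCSB PDB's public REST Data API; the endpoint and field names reflect the current data.rcsb.org service, and the entry ID 4HHB (hemoglobin) is just an example:

    # Minimal sketch: fetch summary metadata for one PDB entry from the RCSB Data API.
    # 4HHB (human hemoglobin) is used purely as an example entry ID.
    import requests

    entry_id = "4HHB"
    resp = requests.get(f"https://data.rcsb.org/rest/v1/core/entry/{entry_id}", timeout=30)
    resp.raise_for_status()
    entry = resp.json()
    print(entry["struct"]["title"])                      # entry title
    print(entry["rcsb_entry_info"]["molecular_weight"])  # total molecular weight (kDa)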

View SDSC Data Science Press Release

October 6, 2014 — The ocean’s future

The ocean’s future?
This San Miguel Island rock wall is covered with a diverse community of marine life. Credit: Santa Barbara Coastal Long-Term Ecological Research Program
UCSB 10/6/2014—Is life in the oceans changing over the years? Are humans causing long-term declines in ocean biodiversity with climate change, fishing and other impacts? At present, scientists are unable to answer these questions because little data exist for many marine organisms, and the small amount of existing data focuses on small, scattered areas of the ocean. A group of researchers from UCSB, the United States Geological Survey (USGS), National Oceanic and Atmospheric Administration (NOAA) National Marine Fisheries Service, and UC San Diego’s Scripps Institution of Oceanography is creating a new prototype system—the Marine Biodiversity Observation Network—to solve this problem. The network will integrate existing data over large spatial scales using geostatistical models and will utilize new technology to improve knowledge of marine organisms. UCSB’s Center for Bio-Image Informatics will use advanced image analysis to automatically identify different species including fish. In addition to describing patterns of biodiversity, the project will use mathematical modeling to examine the value of information on biodiversity in making management decisions as well as the cost of collecting that information in different ways. The five-year $5 million project will center on the Santa Barbara Channel, but the long-term goal is to expand the network around the country and around the world to track over time the biodiversity of marine organisms, from microbes to whales.

View UCSB Data Science Press Release

October 6, 2014 — Genomic sequencing research funded by Moore Foundation award

UCD 10/6/2014—The Gordon and Betty Moore Foundation has announced the selection of 14 Moore Investigators in Data-Driven Discovery. Among them is C. Titus Brown, visiting associate professor of population health and reproduction at the UC Davis School of Veterinary Medicine. Brown will be awarded $1.5 million over five years to support his research on genomic sequencing at UCD when he arrives in January 2015. Brown uses novel computer science tools to explore large genomic sequencing data sets. While at UCD, he will lead the Laboratory of Data Intensive Biology, which works across developmental biology, molecular biology, bioinformatics, metagenomics, and next-generation sequencing to build better biological understanding. With the grant, Brown is planning to build open software to help biologists discover patterns in large distributed data sets—key to understanding ecology and evolution.

View UCD Data Science Press Release

October 5, 2014 — Livermore scientists suggest ocean warming in Southern Hemisphere underestimated

Southern Hemisphere ocean warming underestimated
The Antarctic Ocean is a remote place where icebergs frequently drift off the Antarctic coast and can be seen during their various stages of melting. This iceberg, sighted off the Amery Ice Shelf, also has bands of translucent blue ice formed by sea or freshwater freezing in bands between layers of more compressed and white glacial ice. Credit: Andrew Meijers/BAS
LLNL 10/5/2014—Using satellite observations and a large suite of climate models, Lawrence Livermore scientists have found that long-term ocean warming in the upper 700 meters of Southern Hemisphere oceans has likely been underestimated. “This underestimation is a result of poor sampling prior to the last decade and limitations of the analysis methods that conservatively estimated temperature changes in data-sparse regions,” said LLNL oceanographer Paul Durack, lead author of a paper appearing in the October 5 issue of the journal Nature Climate Change. Ocean heat storage is important because it accounts for more than 90 percent of the Earth’s excess heat that is associated with global warming. The observed ocean and atmosphere warming is a result of continuing greenhouse gas emissions. The Southern Hemisphere oceans make up 60 percent of the world’s oceans. The results suggest that global ocean warming has been underestimated by 24 to 58 percent. The conclusion that warming has been underestimated agrees with previous studies; however, this study is the first time that scientists have tried to quantify how much heat has been missed.

View LLNL Data Science Press Release

October 3, 2014 — Three faculty members awarded National Medal of Science

Two mathematicians win National Medal of Science
The late David Blackwell, former professor of mathematics and statistics
UCB 10/3/2014—Three UC Berkeley faculty members were selected Oct. 3 by President Barack Obama to receive the National Medal of Science, the nation’s highest honor for a scientist. Two were mathematicians: Alexandre J. Chorin, 76, University Professor emeritus of mathematics, and statistician David Blackwell, who died in 2010 at the age of 91. Chorin, who is also a Senior Faculty Scientist in the Mathematics Group at Lawrence Berkeley National Laboratory, introduced powerful new computational methods for the solution of problems in fluid mechanics. His methods are widely used to model airflow over aircraft wings and in turbines and engines, water flow in oceans and lakes, combustion in engines, and blood flow in hearts and veins. His methods have also contributed to the theoretical understanding of turbulent flow. Blackwell, the first black scholar admitted to the National Academy of Sciences and the first tenured black professor in UC Berkeley history, was a mathematician and statistician who contributed to numerous fields, including probability theory, game theory and information theory. He chaired UC Berkeley’s Department of Statistics and served in 1955 as president of the Institute of Mathematical Statistics, an international professional and scholarly society.

View UCB Data Science Press Release

October 3, 2014 — Research opportunity announced for Quantum Artificial Intelligence Laboratory

NASA Ames 10/3/2014—The Universities Space Research Association (USRA) has announced a call for proposals to utilize the D-Wave Two quantum computer at NASA’s Quantum Artificial Intelligence Laboratory, located at the NASA Advanced Supercomputing facility at NASA Ames Research Center. The projects selected will have access to the computer from November 2014 through September 2015 in order to research artificial intelligence algorithms and advanced programming techniques for quantum annealing. The deadline for proposals is October 31, 2014.
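
Quantum annealers such as the D-Wave Two minimize quadratic unconstrained binary optimization (QUBO) objectives. As a hedged illustration of what "programming" such a machine amounts to, here is a tiny QUBO instance in Python, with brute-force classical enumeration standing in for the annealer:

    # Minimal sketch: a QUBO objective of the kind a quantum annealer minimizes.
    # Brute-force enumeration stands in for the annealer; the instance is illustrative.
    from itertools import product

    # E(x) = sum over (i, j) of Q[i, j] * x_i * x_j, with each x_i in {0, 1}.
    Q = {(0, 0): -1.0, (1, 1): -1.0, (0, 1): 2.0}  # penalizes x0 = x1 = 1

    def energy(x):
        return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

    best = min(product((0, 1), repeat=2), key=energy)
    print("ground state:", best, "energy:", energy(best))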

View the USRA call for proposals: http://www.usra.edu/quantum/rfp/

October 2, 2014 — Why we can’t tell a Hollywood heartthrob from his stunt double

Will the real Caribbean pirate please stand up?
A new study explains the visual mechanism behind our inability to tell actors from their stunt doubles, among other things.
UCB 10/2/2014—Johnny Depp has an unforgettable face. Tony Angelotti, his stunt double in Pirates of the Caribbean, does not. So why is it that when they’re swashbuckling on screen, audiences worldwide see them both as the same person? UC Berkeley scientists have cracked that mystery, pinpointing the brain mechanism by which we latch on to a particular face even when it changes. In searching for an exact match to a “target” face on a computer screen, study participants consistently identified a face that was not the target face, but a composite of the faces they had seen over the past few seconds. Moreover, participants judged the match to be more similar to the target face than it really was. While it may seem as though our brain is tricking us into morphing, say, an actor with his stunt double, this “perceptual pull” is actually a survival mechanism, giving us a sense of stability, familiarity and continuity in what would otherwise be a visually chaotic world, the researchers point out. The study was published Thursday, Oct. 2 in the online edition of the journal Current Biology.

View UCB Data Science Press Release

October 2, 2014 — Cybertools offer new channels for free speech, but grassroots organizing still critical

Cybertools aid free speech
Kweku Opoku-Agyemang, a Development Impact Lab Postdoctoral Fellow at UC Berkeley’s Blum Center for Developing Economies, explains how the use of mobile technology that automates survey-taking can help expand the political power of the general public in Ghana. Credit: UC Berkeley video produced by Roxanne Makasdjian and Phil Ebiner
UCB 10/2/2014—Communication tools have transformed social movements in the 50 years since the Free Speech Movement. Online petitions and survey software make it easier for users to register their opinions with elected officials, while fast response times enable organizers to orchestrate logistics or respond to developing events. Reported efforts by Chinese officials to censor news of the current pro-democracy protests in Hong Kong by disrupting access to Instagram and removing references to the demonstrations illustrate the degree to which social media is seen as a threat. On the internet, however, every action leaves a trace, opening up the potential for surveillance. Cybersecurity experts have identified malware called Xsser that infects the operating systems of Apple mobile devices. The malware, with code written in Chinese, is capable of stealing text messages, call logs, photos and passwords. Experts believe Xsser is targeting pro-democracy protesters in Hong Kong. Without anonymity, users are vulnerable to being tracked and persecuted.

View UCB Data Science Press Release

October 2, 2014 — Study of mountain lion energetics shows the power of the pounce

Getting a jump on wildlife management
In the background of this illustration are typical SMART collar accelerometer traces for walking and then running, while the foreground shows a collared puma chasing a black-tailed deer. Credit: Corlis Schneider
UCSC 10/2/2014—Scientists at UC Santa Cruz, using a new wildlife tracking collar developed by a computer engineering graduate student, were able to continuously monitor the movements of mountain lions in the wild and determine how much energy the big cats use to stalk, pounce, and overpower their prey. The research team’s findings, published October 3 in Science, help explain why most cats use a “stalk and pounce” hunting strategy. The new Species Movement, Acceleration, and Radio Tracking (SMART) wildlife collar—equipped with GPS, accelerometers, and other high-tech features—tells researchers not just where an animal is but what it is doing and how much its activities “cost” in terms of energy expenditure. Understanding the energetics of wild animals moving in complex environments is valuable information for developing better wildlife management plans.
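
One common way to turn collar accelerometer traces into an energy proxy is overall dynamic body acceleration (ODBA): subtract a running-mean estimate of the static gravity component on each axis and sum the absolute residuals. The Python sketch below is a generic illustration of that calculation, not the SMART collar's published calibration; the window length, sample rate, and test data are assumptions:

    # Minimal sketch: overall dynamic body acceleration (ODBA) from a 3-axis trace.
    # The 2-second running-mean window and 64 Hz rate are assumed, illustrative values.
    import numpy as np

    def odba(acc, fs, window_s=2.0):
        """acc: (n_samples, 3) array in g; fs: sampling rate in Hz."""
        w = int(window_s * fs)
        kernel = np.ones(w) / w
        # Estimate the static (gravitational) component per axis with a running mean.
        static = np.column_stack(
            [np.convolve(acc[:, k], kernel, mode="same") for k in range(3)]
        )
        return np.abs(acc - static).sum(axis=1)  # per-sample ODBA

    fs = 64  # Hz (assumed)
    t = np.arange(0, 10, 1 / fs)
    acc = np.column_stack([0.3 * np.sin(8 * t), np.zeros_like(t), np.ones_like(t)])
    print("mean ODBA:", odba(acc, fs).mean())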

View UCSC Data Science Press Release

October 1, 2014 — UCSC Ebola genome browser now online to aid researchers’ response to crisis

New Ebola bioinformatics tool: genome browser
The UCSC Ebola Genome Portal contains links to the newly created Ebola browser and to scientific literature on the deadly virus.
UCSC 10/1/2014—A new Ebola bioinformatics tool—a genome browser to assist global efforts to develop a vaccine and antiserum to help stop the spread of the Ebola virus—was released on September 30 by the UC Santa Cruz Genomics Institute. UCSC has established the UCSC Ebola Genome Portal, with links to the new Ebola genome browser as well as links to all the relevant scientific literature on the virus. Scientists around the world can access the open-source browser to compare genetic changes in the virus genome and areas where it remains the same. The browser allows scientists and researchers from drug companies, other universities, and governments to study the virus and its genomic changes as they seek a solution to halt the epidemic. In a similar marshaling of forces in the face of a worldwide threat 11 years ago, UCSC researchers created a SARS virus browser.

View UCSC Data Science Press Release

October 1, 2014 — SDSC granted $1.3 million award for ‘SeedMe.org’ data sharing infrastructure

Sharing Big Data—easily
This image and related research data, one of numerous projects being shared and stored using SeedMe, shows a simple model of a geodynamo used for benchmark codes. The view is from center toward one of the poles, and the cones show convective flow toward higher temperature (light green to dark green with increasing velocity) in a spiraling form caused by rotation. The shells of various colors depict temperature, increasing from the outer boundary towards the interior. Credit: Amit Chourasia, Ashley Willis, Maggie Avery, Chris Davies, Catherine Constable, David Gubbins
SDSC 10/1/2014—Researchers at the San Diego Supercomputer Center (SDSC) at UC San Diego have received a three-year, $1.3 million award from the National Science Foundation (NSF) to develop a web-based resource that lets scientists seamlessly share and access preliminary results and transient data from research on a variety of platforms, including mobile devices. Called Swiftly Encode, Explore and Disseminate My Experiments (SeedMe), the new award is from NSF’s Data Infrastructure Building Blocks (DIBBs) program, part of the foundation’s Cyberinfrastructure Framework for 21st Century Science and Engineering (CIF21). Current methods for sharing and assessing transient data and preliminary results are cumbersome, labor intensive, and largely unsupported by useful tools and procedures. “SeedMe provides an essential yet missing component in current high-performance computing as well as cloud computing infrastructures,” said SDSC Director Michael Norman, co-principal investigator on the project.

View SDSC Data Science Press Release

September 30, 2014 — NIH taps lab to develop sophisticated electrode array system to monitor brain activity

Networking your brain
LLNL is developing an advanced electronics system to monitor and modulate neurons, to be packed with more than 1,000 tiny electrodes embedded in different areas of the brain to record and stimulate neural circuitry.
LLNL 9/30/2014—The National Institutes of Health (NIH) awarded Lawrence Livermore National Laboratory (LLNL) a grant to develop an electrode array system that will enable researchers to better understand how the brain works through unprecedented resolution and scale. LLNL’s project is part of NIH’s efforts to support President Obama’s Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative, a new research effort to revolutionize our understanding of the human mind and uncover ways to treat, prevent and cure brain disorders. LLNL’s goal is to develop a system that will allow scientists to simultaneously study how thousands of neuronal cells in various brain regions work together during complex tasks such as decision making and learning. The biologically compatible neural system will be the first of its kind to have large-scale network recording capabilities that are designed to continuously record neural activities for months to years.

View LLNL Data Science Press Release

September 30, 2014 — NIH awards UC Berkeley $7.2 million to advance brain initiative

Funding the brain
A section of the sensory cortex of a mouse in which cells known as long range projection neurons have been genetically modified to express a green fluorescent protein. John Ngai and colleagues will use single-cell genomics techniques to reveal the diversity of these and other neurons in the brain. Credit: David Taylor and Hillel Adesnik
UCB 9/30/2014—The National Institutes of Health today announced its first research grants through President Barack Obama’s BRAIN Initiative, including three awards to UC Berkeley (UCB), totaling nearly $7.2 million over three years. Among these is a new $5.6 million public-private collaboration between Carl Zeiss Microscopy and UCB to support the Berkeley Brain Microscopy Innovation Center (BrainMIC), which will fast-track microscopy development for emerging neurotechnologies and will run an annual course to teach researchers how to use the new technologies. Part of the Helen Wills Neuroscience Institute, the program will generate innovative devices and analytic tools in engineering, computation, chemistry and molecular biology to enable transformative brain science from studies of human cognition to neural circuits in model organisms.

View UCB Data Science Press Release

September 29, 2014 — At the interface of math and science

Simulating soft matter
Model of vesicle adhesion, rupture and island dynamics during the formation of a supported lipid bilayer (from work by Atzberger et al.) featured on the cover of the journal Soft Matter. Credit: Peter Allen
UCSB 9/29/2014—New mathematical approaches—developed by Paul Atzberger, UC Santa Barbara professor of mathematics and mechanical engineering, his graduate student Jon Karl Sigurdsson, and other coauthors—reveal insights into how proteins move around within lipid bilayer membranes. These microscopic structures can form a sheet that envelops the outside of a biological cell in much the same way that human skin serves as the body’s barrier to the outside environment. “It used to be just theory and experiment,” Atzberger said. “Now computation serves as an ever more important third branch of science. With simulations, one can take underlying assumptions into account in detail and explore their consequences in novel ways. Computation provides the ability to grapple with a level of detail and complexity that is often simply beyond the reach of pure theoretical methods.” Their work was published in Proceedings of the National Academy of Sciences (PNAS) and featured on the cover of the journal Soft Matter.
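
For a sense of the simplest computation in this space, the Python sketch below simulates plain two-dimensional Brownian motion of a protein in the membrane plane; the diffusion coefficient is an assumed order of magnitude, and the sketch deliberately omits the protein-lipid hydrodynamic coupling that the Atzberger methods are designed to capture:

    # Minimal sketch: 2D Brownian motion, x(t+dt) = x(t) + sqrt(2*D*dt) * noise.
    # D ~ 1 um^2/s is an assumed order of magnitude for a membrane protein.
    import numpy as np

    rng = np.random.default_rng(0)
    D, dt, n_steps = 1.0, 1e-3, 10_000    # um^2/s, s, steps (all assumed)

    steps = rng.normal(scale=np.sqrt(2 * D * dt), size=(n_steps, 2))
    traj = np.cumsum(steps, axis=0)       # (x, y) positions in um

    r2 = (traj[-1] ** 2).sum()
    print(f"squared displacement after {n_steps * dt:.0f} s: {r2:.2f} um^2 "
          f"(expected mean ~{4 * D * n_steps * dt:.1f})")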

View UCSB Data Science Press Release

September 25, 2014 — Climate, Earth system project draws on science powerhouses

ACME of climate modeling
Computer modeling provides policymakers with essential information on such data as global sea surface temperatures related to specific currents.
LANL 9/25/2014—The US Department of Energy national laboratories are teaming with academia and the private sector to develop the most advanced climate and Earth system computer model yet created. Accelerated Climate Modeling for Energy (ACME) is designed to accelerate the development and application of fully coupled, state-of-the-science Earth system models for scientific and energy applications. The project—which includes seven other national laboratories, four academic institutions, and one private-sector company—will focus initially on three climate-change science drivers and corresponding questions to be answered during the project's initial phase: water cycle, biogeochemistry, and cryosphere systems. Over a planned decade, the project aims to conduct simulations and modeling on the most sophisticated high-performance computing systems as they become available: 100+ petaflop machines and eventually exascale supercomputers.

View LANL Data Science Press Release

September 25, 2014 — Three Bay Area institutions join forces to seed transformative brain research

Seeding high-risk brain research
Michel Maharbiz of electrical engineering and computer science describes a project to probe more deeply into the cerebral cortex. Credit: Roy Kaltschmidt/LBNL
UCB 9/25/2014—UC Berkeley, UC San Francisco, and Lawrence Berkeley National Laboratory (LBNL) each put up $1.5 million over three years to seed innovative but risky research in a one-of-a-kind collaboration called BRAINseed. Among their projects is one for development of instrumentation and computational methods. Though great progress has been made in mapping the function of the human brain, researchers have been stymied by limitations in both recording devices and the ability to analyze and understand brain signals. UCSF’s Edward F. Chang, M.D., is leading a team that aims to achieve up to a thousandfold increase in the density and electronic sophistication of recording arrays. The vast amount of data collected by these arrays will be stored and analyzed by some of the world’s most powerful computers at the National Energy Research Scientific Computing Center (NERSC), enabling a new level of understanding of the brain in both health and disease. Chang’s collaborators are Peter Denes and Kristofer Bouchard of LBNL and Fritz Sommer of UCB.

View UCB Data Science Press Release

September 24, 2014 — Human genome was shaped by an evolutionary arms race with itself

Evolutionary “arms race” shaped our genomes
An evolutionary arms race has shaped the genomes of primates, including humans. Credit: David Greenberg
UCSC 9/24/2014—New findings by scientists at UC Santa Cruz suggest that an evolutionary arms race between rival elements within the genomes of primates drove the evolution of complex regulatory networks that orchestrate the activity of genes in every cell of our bodies. The arms race is between mobile DNA sequences known as “retrotransposons”—nicknamed “jumping genes”—and the genes that have evolved to control them. The UC Santa Cruz researchers have, for the first time, identified genes in humans that make repressor proteins to shut down specific jumping genes. The researchers also traced the rapid evolution of the repressor genes in the primate lineage. The study involved close collaboration between a “wet lab” for developing genetic assays and a “dry lab” where researchers used computational tools of genome bioinformatics to reconstruct the evolutionary history of primate genomes. Their findings are published in the September 28 issue of Nature.

View UCSC Data Science Press Release

September 23, 2014 — NERSC helps corroborate two distinct mechanisms in ferroelectric material

Driving oxygen through exotic material
Collaborators from Korea, Norway, Ukraine and the United States analyzed atomic-scale polarization behavior and chemical composition for a ferroelectric (BFO) film on a metal (LSMO) to reveal electrically driven chemical changes that may someday be manipulated in novel oxide electronic devices. Credit: Y.-M. Kim et al./ORNL
NERSC 9/23/2014—Complex oxide crystals—which combine oxygen atoms with assorted metals—have long tantalized the materials science community with their promise in next-generation energy and information technologies. Because their electrons interact strongly with their environments, complex oxides are versatile, existing as insulators, metals, magnets and superconductors. They can tightly couple diverse physical properties, such as stress and strain, magnetism and magnetic order, electric field and polarization. In highly correlated electron systems, physical properties are interconnected like a tangle of strings: often pulling one string takes others with it. Increased understanding of the properties of complex oxides will improve the ability to predict and control materials for new energy technologies. One project has already led to a surprising discovery—that intrinsic electric fields can drive oxygen diffusion at interfaces in engineered thin films made of complex oxides. An Oak Ridge National Laboratory research team used supercomputing resources at the Department of Energy’s National Energy Research Scientific Computing Center (NERSC) to help confirm their findings. The study is published in the September issue of Nature Materials.

View NERSC Data Science Press Release

September 23, 2014 — UC Santa Cruz establishes Symantec Presidential Chair in Storage and Security

$1 million for new chair in data storage/security
Ethan Miller directs the UC Santa Cruz Center for Research in Storage Systems. Credit: C. Lagattuta
UCSC 9/23/2014—A $500,000 gift to UC Santa Cruz from computer security company Symantec, plus matching funds from the UC Presidential Match for Endowed Chairs, will establish a $1 million endowment at UC Santa Cruz for the Symantec Presidential Chair in Storage and Security at UCSC’s Baskin School of Engineering. The endowed chair supports research and teaching in the engineering school’s Department of Computer Science, which has strong programs in computer security and data storage systems. Ethan Miller, professor of computer science and director of the Center for Research in Storage Systems at UC Santa Cruz, has been appointed as the inaugural holder of the new chair. The Baskin School of Engineering is home to world-class faculty in data storage systems and other key areas of data science. Symantec’s gift is a significant contribution to the Data Science Leadership Initiative of the $300-million Campaign for UC Santa Cruz.

View UCSC Data Science Press Release

September 23, 2014 — Los Alamos researchers uncover new properties in nanocomposite oxide ceramics for reactor fuel, fast-ion conductors

Misfits rule (in nanocomposite oxide ceramics)
Schematic depicting distinct dislocation networks for SrO- and TiO2-terminated SrTiO3/MgO interface
LANL 9/23/2014—Nanocomposite oxide ceramics have potential uses as ferroelectrics, fast ion conductors, and nuclear fuels and for storing nuclear waste. A composite is a material containing grains, or chunks, of several different materials; in a nanocomposite, the size of each grain is on the order of nanometers, roughly 1000 times smaller than the width of a human hair. Interfaces where the different materials meet are regions of unique electronic and ionic properties, which could enhance conductivity of batteries and fuel cells. Los Alamos National Laboratory (LANL) simulations that explicitly account for the position of each atom within two different materials reveal that some interfaces exhibit remarkably different atomic structures: misfit dislocations that form when the two materials do not exactly match in size dictate the functional properties of the interface, such as the conductivity. The observed relationship between the termination chemistry and the dislocation structure of the interface offers potential avenues for tailoring transport properties and radiation damage resistance of oxide nanocomposites by controlling the termination chemistry at the interface. The research is described in a paper published in the journal Nature Communications.
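
The geometry behind such misfit dislocation networks is compact enough to state: the lattice mismatch between film and substrate sets a characteristic spacing between interface dislocations. A back-of-envelope version in LaTeX, using illustrative textbook lattice constants rather than numbers from the LANL simulations:

    f = \frac{a_s - a_f}{a_f}, \qquad D \approx \frac{b}{f}
    % where a_f, a_s are film and substrate lattice constants and b is the
    % Burgers vector magnitude.
    % Illustration (textbook values, not from the paper):
    % a_{\mathrm{SrTiO_3}} \approx 3.905~\text{\AA}, \quad
    % a_{\mathrm{MgO}} \approx 4.212~\text{\AA} \Rightarrow f \approx 0.079,
    % so with b \approx 4~\text{\AA}, \; D \approx 4/0.079 \approx 51~\text{\AA} \approx 5~\text{nm}.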

View LANL Data Science Press Release

September 22, 2014 — Linguists receive $260,000 grant to study endangered Irish language

Studying the endangered Irish tongue—literally
Screen shot of ultrasound of tongue of native Irish speaker
UCSC 9/22/2014—Even though the Irish language is an official language of Ireland and has considerable government support, it is highly endangered: only 1.5% to 3% of the population regularly uses it in their communities, and its future is in doubt. One unusual feature of the Irish language is that every consonant comes in two varieties—one where the tongue is raised and pushed forward, and one where it is raised and retracted. So one important goal of researchers is to document that contrast—using ultrasonic real-time tongue imaging to non-invasively capture video of the tongue’s surface while it moves during speech. UC Santa Cruz has been awarded a $261,255 grant from the National Science Foundation for a new project titled “Collaborative Research: An Ultrasound Investigation of Irish Palatalization” to take the ultrasound machine to Ireland, record native speakers in three major dialect regions that are isolated from one another, and analyze the data. Analysis of the ultrasound data will also allow the researchers to answer more general questions about speech production.

View UCSC Data Science Press Release

September 22, 2014 — Lawrence Livermore renews pact with Georgetown University to expand research and education in science and policy

LLNL, Georgetown renew data science/analytics pact
LLNL Director Bill Goldstein and Georgetown President John DeGioia sign a memorandum of understanding on Friday, renewing the LLNL-Georgetown partnership and expanding areas of collaboration. Credit: Georgetown University
LLNL 9/22/2014—Lawrence Livermore National Laboratory (LLNL) Director Bill Goldstein and Georgetown University President John DeGioia on Friday renewed their institutional commitment by signing a memorandum of understanding for an additional five years to expand the collaborative work in the areas of cyber security, biosecurity, nonproliferation and global climate, energy and environmental sciences. This renewal represents a significant expansion of an MOU originally signed in December 2009 and is a framework to broaden LLNL collaborations university-wide, including the Georgetown University Medical Center. The new MOU expands the fields of study to include data science and data analytics; bio-security; emergency and disaster management; global climate, energy and environment; food safety and security; and biotechnology (including such fields as infectious diseases, drug discovery, regenerative medicine, and urban resilience). In data analytics, projects include a potential collaboration on a new master’s degree program as well as working together to create a shared computational infrastructure leveraging LLNL’s high-performance computing capabilities.

View LLNL Data Science Press Release

September 19, 2014 — Project launched to study evolutionary history of fungi

Bread mold’s half-billion-year family tree
Jason Stajich is an associate professor of plant pathology and microbiology at UC Riverside. Credit: Stajich Lab/UC Riverside
UCR 9/19/2014—UC Riverside is one of 11 collaborating institutions that have received a total of $2.5 million from the National Science Foundation for a four-year project called the Zygomycete Genealogy of Life (ZyGoLife) focused on studying zygomycetes: ancient lineages of fungi used in numerous industrial processes and fermentation of foods. Thought to be among the first terrestrial fungi, zygomycetes represent one of the earliest origins of multicellular growth forms. Indeed, symbiotic associations with zygomycetes may have facilitated the origin of land plants. Their filamentous growth is in the form of the tube-like cell growth that characterizes species of fungi including bread and fruit molds, animal and human pathogens, and decomposers of a wide variety of organic compounds. Jason Stajich, associate professor of plant pathology and microbiology, is principal investigator. His UCR lab will be spearheading genome sequence analysis to better establish the family tree of fungi from these lineages and disseminating data into genomic databases such as MycoCosm of the Joint Genome Institute of the U.S. Department of Energy and FungiDB. The Stajich lab will also host visiting students and postdocs from the other teams to provide training in bioinformatics and evolutionary genomics in fungi.

View UCR Data Science Press Release

September 19, 2014 — Library taps specialist to explore role of technology in humanities research

Digital humanities scholarship
Rachel Deblinger
UCSC 9/19/2014—The UC Santa Cruz University Library and Humanities Division have jointly awarded a two-year Council on Library and Information Resources (CLIR) Postdoctoral Fellowship supporting digital humanities scholarship to Rachel Deblinger. As a CLIR Digital Humanities Specialist, Deblinger will have the opportunity to build a community around digital humanities scholarship at a time when the practice is emerging at UCSC. Collaborating with librarians, faculty and students across multiple divisions, Deblinger will explore online collaborative research practices supporting digital humanities and develop a pilot infrastructure to support this research. She will also examine the role of the University Library in supporting digital humanities, conduct workshops, and help to facilitate graduate research. Previously, at UCLA, she served as technical consultant on the development of a textual database supporting the publication of Sephardic Lives: A Documentary History of the Ottoman Judeo-Spanish World and its Diaspora, 1700-1950, and was the thematic expert at the UCLA Center for Digital Humanities for the computational visualization of Shoah Foundation Holocaust testimonies.

View UCSC Data Science Press Release

September 16, 2014 — The Exxon Valdez — 25 years later

Data-mining the 1989 Exxon Valdez oil spill
Oil coated the rocky shoreline after the Exxon Valdez ran aground, leaking 10 to 11 million gallons of crude oil into the Gulf of Alaska on March 24, 1989. Credit: Alaska Resources Library & Information Services
UCSB 9/16/2014—UC Santa Barbara’s National Center for Ecological Analysis and Synthesis (NCEAS) has collaborated with investigators from Gulf Watch Alaska and the Herring Research and Monitoring Program to collate historical data from a quarter-century of monitoring studies on physical and biological systems altered by the 1989 Exxon Valdez oil spill. Now, two new NCEAS working groups will synthesize this and related data and conduct a holistic analysis to answer pressing questions about the interaction between the oil spill and larger drivers such as broad cycles in ocean currents and water temperatures. Both statistical and modeling approaches will be used to understand both mechanisms of change and the changes themselves, and to create an overview of past changes and potential futures for the entire area. The investigators will use time series modeling approaches to determine the forces driving variability over time in these diverse datasets. They will also examine the influences of multiple drivers, including climate forcing, species interactions and fishing. By evaluating species’ life history attributes, such as longevity and location, and linking them to how and when each species was impacted by the spill, the researchers may help predict ecosystem responses to other disasters and develop monitoring strategies to target vulnerable species before disasters occur.
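
A minimal Python sketch of the kind of model involved, an autoregressive term for year-to-year persistence plus an exogenous climate driver, fit by least squares, is shown below; the data are synthetic placeholders, not the Gulf Watch series, and the working groups' actual models are richer state-space formulations:

    # Minimal sketch: fit y_t = a*y_{t-1} + b*driver_t + c + noise by least squares.
    # Synthetic placeholder data stand in for the monitoring time series.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100
    driver = rng.normal(size=n)              # stand-in climate driver (e.g., SST anomaly)
    y = np.zeros(n)
    for t in range(1, n):                    # simulate a known AR(1)-plus-driver process
        y[t] = 0.6 * y[t - 1] + 0.8 * driver[t] + rng.normal(scale=0.5)

    X = np.column_stack([y[:-1], driver[1:], np.ones(n - 1)])
    coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    print("estimated persistence, driver effect, intercept:", np.round(coef, 2))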

View UCSB Data Science Press Release

September 16, 2014 — Human faces are so variable because we evolved to look unique

Why don’t we all look alike?
UCB 9/16/2014—The amazing variety of human faces—far greater than that of most other animals—is the result of evolutionary pressure to make each of us unique and easily recognizable, according to a new study by UC Berkeley scientists. They were able to assess human facial variability thanks to a U.S. Army database of body measurements compiled from male and female personnel in 1988. The researchers found that facial traits are much more variable than other bodily traits, such as the length of the hand. They also had access to data collected by the 1000 Genomes Project, which has sequenced more than 1,000 human genomes since 2008 and catalogued nearly 40 million genetic variations among humans worldwide. Looking at regions of the human genome that have been identified as determining the shape of the face, they found a much higher number of variants than for traits (such as height) not involving the face. They also compared the human genomes with recently sequenced genomes of Neanderthals and Denisovans and found similar genetic variation, which indicates that the facial variation in modern humans must have originated prior to the split between these lineages. The study appeared Sept. 16 in the online journal Nature Communications.

View UCB Data Science Press Release

September 15, 2014 — 2014 Berkeley-Rupp Prize for boosting women in architecture, sustainability announced

Prize for sustainable, soft architecture
KVA Matx Soft Rockers night time illumination and gathering. Credit: Phil Seaton/Living Photo
UCB 9/15/2014—Sheila Kennedy, an internationally recognized architect, innovator and educator, is the 2014 recipient of the Berkeley-Rupp Architecture Professorship and Prize. Awarded every two years, the Berkeley-Rupp Prize of $100,000 is given by UC Berkeley’s College of Environmental Design to a distinguished design practitioner or academic who has made a significant contribution to advance gender equity in the field of architecture, and whose work emphasizes a commitment to sustainability and community. As part of her research, Kennedy will partner with non-governmental organizations (NGOs) to engage communities of fabricators in three developing regions around the world. She will lead UC Berkeley students in computation, architectural design, engineering, and city planning in a series of hands-on design workshops exploring new urban infrastructure. Using soft materials—from paper to wood to bio-plastic—the group will develop open-source digital-fabrication techniques and create adaptable prototypes such as pop-up solar streetlights, soft refrigeration kits for bicycle vendors, and public benches that collect and clean fresh water.

View UCB Data Science Press Release

September 15, 2014 — Collaboration drives achievement in protein structure research

Blueprint: how bacteria remember viruses
Thomas Terwilliger
LANL 9/15/2014—Computational analysis is key to structural understanding of a molecular machine that targets viral DNA. Researchers at Montana State University have provided the first blueprint of a bacterium’s “molecular machinery,” showing how bacterial immune systems fight off the viruses that infect them. By tracking down how bacterial defense systems work, the scientists can potentially fight infectious diseases and genetic disorders. The key is a repetitive piece of DNA in the bacterial genome called a CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats). The bacterial genome uses the CRISPR to capture and “remember” the identity of an attacking virus. Scientists have now created programmable molecular scissors, called nucleases, that are being exploited for precisely altering the DNA sequence of almost any cell type of interest. LANL—along with partners Lawrence Berkeley National Laboratory, Duke University and the University of Cambridge—developed software to analyze the protein structure of the nuclease, which was key to understanding its function. The researchers reported their findings in the journal Science.

View LANL Data Science Press Release

September 11, 2014 — Changing how we farm can save evolutionary diversity, study suggests

Rescue wildlife diversity: diversify agriculture
Diversified farms, such as this coffee plantation in Costa Rica, house substantial phylogenetic diversity. Credit: Daniel Karp
UCB 9/11/2014—A new study by biologists at Stanford University and UC Berkeley highlights the dramatic hit on the evolutionary diversity of wildlife when forests are transformed into agricultural lands. The researchers counted some 120,000 birds of nearly 500 species in three types of habitat in Costa Rica, and calculated the birds’ phylogenetic diversity, a measure of the evolutionary history embodied in wildlife. The study, published in the Sept. 12 issue of Science, found that the phylogenetic diversity of the birds fared worst in habitats characterized by intensive farmlands consisting of single crops. Such intensive monocultures supported 900 million fewer years of evolutionary history, on average, compared with untouched forest reserves. The researchers found a middle ground in diversified agriculture, or farmlands with multiple crops adjoined by small patches of forest. Such landscapes supported on average 600 million more years of evolutionary history than the single crop farms. This work is urgent, the authors say, because humanity is driving about half of all known life to extinction—including many species that play key roles in Earth’s life-support systems—mostly through agricultural activities to support our vast numbers and meat-rich diets.
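
The phylogenetic diversity measure referred to here is, in its standard form (Faith's PD), the total branch length of the smallest subtree connecting the observed species, which is why losses are reported in years of evolutionary history. A minimal Python sketch on a hand-built toy tree, purely for illustration:

    # Minimal sketch: Faith's phylogenetic diversity = summed branch lengths of the
    # subtree spanning the observed species. Toy tree; each node maps to
    # (parent, branch_length_to_parent_in_Myr).
    tree = {
        "sp_A": ("anc1", 10.0), "sp_B": ("anc1", 10.0),
        "sp_C": ("root", 25.0), "anc1": ("root", 15.0), "root": (None, 0.0),
    }

    def faith_pd(tree, observed):
        counted = set()
        for node in observed:                # walk each observed species up to the root
            while node is not None and node not in counted:
                counted.add(node)
                node = tree[node][0]
        return sum(tree[n][1] for n in counted)

    print(faith_pd(tree, ["sp_A", "sp_B"]))           # 35.0 Myr
    print(faith_pd(tree, ["sp_A", "sp_B", "sp_C"]))   # 60.0 Myr
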
View UCB Data Science Press Release

September 11, 2014 — Our microbes are a rich source of drugs, UCSF researchers discover

The microbial drug factories living within you
UCSF 9/11/2014—Bacteria that normally live in and upon us have genetic blueprints that enable them to make thousands of molecules that act like drugs, and some of these molecules might serve as the basis for new human therapeutics, according to UC San Francisco researchers in a study published in the Sept. 11 issue of the journal Cell. Microbiomes—ecosystems made up of many microbial species—are found in the gut, skin, nasal passages, mouth and vagina. Scientists have started to identify microbiomes in which species diversity and abundance differ from the normal range in ways that are associated with disease risks. By developing new data-analysis software and putting it to work on an extensive genetic database developed from human-associated bacterial samples collected as part of the National Institutes of Health’s ongoing Human Microbiome Project, the UCSF lab team identified clusters of bacterial genes that are switched on in a coordinated way to guide the production of molecules that are biologically active in humans.

View UCSF Data Science Press Release

September 11, 2014 — Teaching computers the nuances of human conversation

SWF seeks companion robot for conversation, love
Marilyn Walker. Credit: C. Lagattuta
UCSC 9/11/2014—Natural language processing is now so good that the failure of automated voice-recognition systems to respond in a natural way has become glaringly obvious, according to Marilyn Walker, UC Santa Cruz professor of computer science. One of Walker’s current projects, funded last year by the National Science Foundation, involves analyzing posts from online debate forums to see how people present facts to support arguments. By annotating the online posts to uncover patterns in word choice and sentence construction, researchers seek to build a program that can identify sarcasm, report a poster’s stance on a topic, and identify the arguments and counter-arguments for a particular topic. Such software could also be useful as an educational tool: psychological evidence suggests that a debate becomes less polarized if people are exposed to multiple arguments. Also, changing how computers talk may create an unspoken relationship that strengthens our connections to devices. Ultimately, such technology could be used to create companion robots, navigation programs, or restaurant recommendation software that interact with us more naturally.
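
A minimal Python sketch of the baseline version of such a classifier, bag-of-words features plus logistic regression over annotated posts, is shown below; the four "posts" are invented placeholders, and Walker's group works with far richer features than word counts:

    # Minimal sketch: bag-of-words stance classifier for debate-forum posts.
    # The tiny corpus is invented; real systems add sarcasm and argument features.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    posts = [
        "We should absolutely adopt this policy; the evidence is clear.",
        "The data support the proposal and I stand behind it.",
        "This proposal ignores the facts and would be a disaster.",
        "No sensible person could support such a flawed plan.",
    ]
    stance = ["pro", "pro", "con", "con"]

    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    model.fit(posts, stance)
    print(model.predict(["I fully support the proposal."]))  # expect ['pro']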

View UCSC Data Science Press Release

September 10, 2014 — UCSF, Google Earth Engine making maps to predict malaria

Predicting malaria, targeting response
A sample risk map of malaria in Swaziland during the transmission season using data from 2011–2013.
UCSF 9/10/2014—UC San Francisco (UCSF) is working to create an online platform that health workers around the world can use to predict where malaria is likely to be transmitted using data on Google Earth Engine. The goal is to enable resource-poor countries to wage more targeted and effective campaigns against the mosquito-borne disease, which kills 600,000 people a year, most of them children. Google Earth Engine brings together the world’s satellite imagery—trillions of scientific measurements dating back almost 40 years—with data-mining tools for scientists, independent researchers and nations to detect changes, map trends and quantify differences on the Earth’s surface. Local health workers will be able to upload their own data on where and when malaria cases have been occurring and combine it with real-time satellite data on weather and other environmental conditions within Earth Engine to pinpoint where new cases are most likely to occur. That way, they can spray insecticide, distribute bed nets or give antimalarial drugs just to the people who still need them, instead of blanketing the entire country. By looking at the relationship between disease occurrence and factors such as rainfall, vegetation and the presence of water in the environment, the maps will also help health workers and scientists study what drives malaria transmission. The tool could also be adapted to predict other infectious diseases.
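
As a rough illustration of the Earth Engine side of such a workflow, the Python sketch below pulls a rainfall covariate at a single case location using the ee API; the dataset ID, coordinates, and scale are assumptions for illustration, and authentication setup is omitted:

    # Minimal sketch: recent-rainfall covariate for one malaria case location.
    # The CHIRPS dataset ID, the example point, and the 5 km scale are assumptions;
    # ee.Authenticate() must have been run once beforehand.
    import ee

    ee.Initialize()
    rain = (
        ee.ImageCollection("UCSB-CHG/CHIRPS/DAILY")
        .filterDate("2013-01-01", "2013-02-01")
        .sum()                                    # total January rainfall, mm
    )
    case_site = ee.Geometry.Point([31.5, -26.5])  # example point in Swaziland
    print(rain.reduceRegion(ee.Reducer.first(), case_site, scale=5000).getInfo())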

View UCSF Data Science Press Release

September 10, 2014 — Advanced Light Source sets microscopy record

Ready for the nanoscope?
Ptychographic image using soft X-rays of lithium iron phosphate nanocrystal after partial dilithiation. The delithiated region is shown in red.
LBNL 9/10/2014—A record-setting X-ray microscopy experiment may have ushered in a new era for nanoscale imaging. Working at the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (LBNL), a collaboration of researchers used low energy or “soft” X-rays to image structures only five nanometers in size. This resolution, obtained at LBNL’s Advanced Light Source (ALS), is the highest ever achieved with X-ray microscopy. Using ptychography, a coherent diffractive imaging technique based on high-performance scanning transmission X-ray microscopy (STXM), the collaboration was able to map the chemical composition of lithium iron phosphate nanocrystals after partial dilithiation. The results yielded important new insights into a material of high interest for electrochemical energy storage. Key to the success of ALS scientist David Shapiro and his collaborators were the use of soft X-rays, which have wavelengths between 1 and 10 nanometers, and a special algorithm that eliminated the effect of all incoherent background signals. The findings were published in the journal Nature Photonics.
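
At its core, a ptychographic reconstruction iterates a simple update: propagate a probe-times-object guess to the detector, replace its modulus with the measured diffraction amplitude, propagate back, and nudge the object estimate. The Python sketch below shows that update at a single scan position with toy arrays; real reconstructions scan many overlapping positions and, as at the ALS, must also handle incoherent background:

    # Minimal sketch: a PIE-style ptychography update at one scan position.
    # Toy 64x64 arrays; real reconstructions loop over many overlapping positions.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 64
    g = np.exp(-((np.arange(n) - n / 2) ** 2) / 50.0)
    probe = np.outer(g, g).astype(complex)                # Gaussian illumination
    true_obj = np.exp(1j * rng.normal(scale=0.3, size=(n, n)))
    measured_amp = np.abs(np.fft.fft2(probe * true_obj))  # detector amplitudes

    obj = np.ones((n, n), complex)                        # initial object guess
    for _ in range(100):
        psi = probe * obj
        Psi = np.fft.fft2(psi)
        Psi = measured_amp * np.exp(1j * np.angle(Psi))   # enforce measured modulus
        psi_new = np.fft.ifft2(Psi)
        obj += np.conj(probe) / (np.abs(probe) ** 2).max() * (psi_new - psi)

    err = np.abs(np.abs(np.fft.fft2(probe * obj)) - measured_amp).mean()
    print("mean amplitude error after 100 iterations:", err)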

View LBNL Data Science Press Release

September 9, 2014 — The search for Ebola immune response targets

Wanted: Ebola immune response data
SDSC 9/9/2014—The effort to develop therapeutics and a vaccine against the deadly Ebola virus disease (EVD) requires a complex understanding of the microorganism and its relationship with the host, especially the immune response. Adding to the challenge, EVD can be caused by any one of five known species within the genus Ebolavirus (EBOV), in the Filovirus family. Now, researchers at the La Jolla Institute for Allergy and Immunology (La Jolla Institute) and the San Diego Supercomputer Center (SDSC) at UC San Diego are assisting the scientific community by rapidly analyzing data in the Immune Epitope Database (IEDB), publishing those analyses online, and predicting epitopes using the IEDB Analysis Resource.

View SDSC Data Science Press Release

September 9, 2014 — SDSC joins the Intel Parallel Computing Centers Program

SDSC 9/9/2014—The San Diego Supercomputer Center (SDSC) at UC San Diego is working with semiconductor chipmaker Intel Corporation to further optimize research software to improve the parallelism, efficiency, and scalability of widely used molecular and neurological simulation technologies. The collaboration is part of the Intel Parallel Computing Centers program, which provides funding to universities, institutions, and research labs to modernize key community codes used across a wide range of disciplines to run on current industry-standard parallel architectures. SDSC researchers are expanding the Intel relationship to cover additional research areas and software packages, including improving the performance of simulations on manycore architectures, to allow researchers to study chemical reactions directly, without severe approximations. With President Obama’s announcement of the BRAIN initiative in April 2013, many are predicting computational neuroscience will have a scientific impact to rival what computational genomics had during the last decade.

View SDSC Data Science Press Release

September 9, 2014 — Simulating the South Napa earthquake

Simulating the South Napa earthquake
Researchers are using data from the South Napa earthquake to help validate models. These figures compare the observed intensity of ground motion from seismic stations of the California Integrated Seismic Network “ShakeMap” (left) with the simulated shaking (peak ground velocity) from SW4 simulations (right).
LLNL 9/9/2014—Lawrence Livermore seismologist Artie Rodgers is tapping into LLNL's supercomputers to simulate the detailed ground motion of August’s magnitude 6.0 South Napa earthquake, the largest to hit the San Francisco Bay Area since the magnitude 6.9 Loma Prieta event in 1989. Using descriptions of the earthquake source from the UC Berkeley Seismological Laboratory, Rodgers is determining how the details of the rupture process and 3D geologic structure, including the surface topography, may have affected the ground motion. Seismic simulations allow scientists to better understand the distribution of shaking and damage that can accompany earthquakes, including possible future “scenario” earthquakes. Simulations are only as valid as the elements going into them; thus the recent earthquake provides data to validate methods and models.

View LLNL Data Science Press Release

September 8, 2014 — Los Alamos conducts important hydrodynamic experiment in Nevada

Surrogate nuclear materials mimic weapons
Technicians at the Nevada National Security Site make final adjustments to the "Leda" experimental vessel in the "Zero Room" at the underground U1a facility.
LANL 9/8/2014—On August 12, 2014, Los Alamos National Laboratory (LANL) successfully fired the latest in a series of experiments at the Nevada National Security Site (NNSS). The experiment, called Leda, used high explosives to implode a plutonium surrogate material in what weapon physicists call a “weapon-relevant geometry.” Hydrodynamic experiments such as Leda involve surrogate non-nuclear materials that mimic many of the properties of nuclear materials. Hydrodynamics refers to the fact that solids, under extreme conditions, begin to mix and flow like liquids. Scientists will now study the data in detail and compare it with pre-shot predictions. The resulting findings will help physicists assess their ability to predict weapon-relevant physics and model weapon performance in the absence of full-scale underground nuclear testing.

View LANL Data Science Press Release

September 5, 2014 — When good software goes bad

When good software goes bad
This so-called Blue Screen of Death is often the result of errors in software. Credit: Courtesy Image
UCSB 9/5/2014—With computing distributed across multiple machines on the cloud, errors and glitches are not easily detected before software is rolled out to the public. As a result, bugs manifest themselves after the programs have been downloaded, costing software companies time, money and even user confidence, and leaving devices vulnerable to security breaches. With a grant of nearly $500,000 from the National Science Foundation, UC Santa Barbara computer scientist Tevfik Bultan and his team are studying verification techniques that can catch and repair bugs in code that manipulates and updates data in web-based software applications. Using techniques that translate the software’s data operations into formulas that can be evaluated with mathematical logic, Bultan’s team can verify the soundness of a given piece of software. By automating the process and adding steps to update the software as needed, the team aims to cut the time and money lost to crashes, perpetuated errors, vulnerabilities and other glitches.
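
For flavor, here is a minimal sketch of logic-based verification in the spirit described above (not Bultan's actual system), using the Z3 SMT solver: the update's effect on data is encoded as a formula, and the solver searches for any input that breaks the intended invariant.

```python
# A minimal sketch, assuming the Z3 SMT solver (pip install z3-solver).
# Encode a data update as logic and ask whether any state violates the
# invariant the update is supposed to preserve.
from z3 import And, Int, Not, Solver, sat

balance = Int("balance")   # account balance before the update
amount = Int("amount")     # amount the update tries to withdraw

pre = And(balance >= 0, amount >= 0)   # assumed precondition
post = balance - amount >= 0           # invariant the update must preserve

s = Solver()
s.add(pre, Not(post))                  # look for a counterexample
if s.check() == sat:
    print("bug found:", s.model())     # e.g., amount greater than balance
else:
    print("verified: invariant holds for every input")
# Adding the guard s.add(amount <= balance) before checking makes the
# counterexample search come back unsat, i.e., the guarded update verifies.
```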

View UCSB Data Science Press Release

September 4, 2014 — A Q&A on the future of digital health

Real health data differ from controlled trials
Michael Blum, director of the Center for Digital Health Innovation, speaks at a recent conference.
UCSF 9/4/2014—We recently sat down with Michael Blum, director of UCSF’s Center for Digital Health Innovation, to talk about the future of health wearables and what more accurate health data could teach us about improving patient care. Among the topics he brought up: One of the first things we are going to find is that a lot of real world health data and information about the general public looks very different than it did in highly controlled clinical trials. When we start to get these very large, detailed real-world data sets we may be very surprised to see the answers.

View UCSF Data Science Press Release

September 2, 2014 — Sierra Nevada freshwater runoff could drop 26 percent by 2100, UC study finds

Warm climate = thirsty plants = less water runoff
The Sierra Nevada snowpack runoff will diminish as a warmer climate encourages more plant growth at higher temperatures, a UC Irvine and UC Merced study has determined. Credit: Matt Meadows /UC Merced
UCI/UCM 9/2/2014—By 2100, communities dependent on freshwater from mountain-fed rivers could see significantly less water, according to a new climate model recently released by researchers at UC Irvine and UC Merced. As the climate warms, higher elevations that are usually snow-dominated see milder temperatures, and plants that normally go dormant under the winter snows remain active longer, absorbing and evaporating more water and reducing projected runoff. Using water-vapor emission rates and remote-sensing data, the authors determined relationships between elevation, climate and evapotranspiration. Greater vegetation density at higher elevations in the Kings basin under the 4.1 degrees Celsius warming projected by climate models for 2100 could boost basin evapotranspiration by as much as 28 percent, with a corresponding 26 percent decrease in river flow. The study findings appear in Proceedings of the National Academy of Sciences. Scientists have long recognized that something like this was possible, but no one had been able to quantify whether the effect could be big enough to concern California water managers.
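
As a back-of-the-envelope illustration, a simple water balance (runoff equals precipitation minus evapotranspiration) shows how a 28 percent rise in evapotranspiration can yield a roughly 26 percent drop in runoff; the numbers below are invented solely to match the study's headline figures.

```python
# Toy annual water balance for a mountain basin. All quantities are
# illustrative, chosen only to reproduce the 28% / 26% relationship.
precipitation = 1000.0        # mm/yr over the basin
evapotranspiration = 480.0    # mm/yr lost to plants and evaporation
runoff = precipitation - evapotranspiration          # 520 mm/yr to rivers

et_warmer = evapotranspiration * 1.28                # +28% with 4.1 C warming
runoff_warmer = precipitation - et_warmer            # 385.6 mm/yr

print(f"runoff change: {100 * (runoff_warmer / runoff - 1):.0f}%")  # about -26%
```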

UCM release
UCI Release

September 5, 2014 — New catalyst converts CO₂ to fuel

New catalyst converts CO₂ to fuel
Structure of 2D molybdenum disulfide. Molybdenum atoms are shown in teal, sulfur atoms in yellow. Credit: Wang et al, Massachusetts Institute of Technology
LBNL/NERSC 9/5/2014—Scientists from the University of Illinois at Chicago have synthesized a catalyst that improves their system for converting waste carbon dioxide (CO₂) into syngas, a precursor of gasoline and other energy-rich products, bringing the process closer to commercial viability. The unique two-step catalytic process uses molybdenum disulfide and an ionic liquid to transfer electrons to CO₂ in a chemical reaction. The new catalyst improves efficiency and lowers cost by replacing expensive metals like gold or silver in the reduction reaction, directly reducing CO₂ to syngas (a mixture of carbon monoxide plus hydrogen) without the need for a secondary, expensive gasification process. Supercomputing resources at the U.S. Department of Energy’s National Energy Research Scientific Computing Center (NERSC) helped the research team confirm their experimental findings.

View LBNL,NERSC Data Science Press Release

September 2, 2014 — Truly secure e-commerce: quantum crypto-keys

Secure computing for the ‘Everyman’
This small device developed at Los Alamos National Laboratory uses the truly random spin of light particles as defined by laws of quantum mechanics to generate a random number for use in a cryptographic key that can be used to securely transmit information between two parties.
LANL 9/2/2014—The largest information technology agreement ever signed by Los Alamos National Laboratory (LANL) brings the potential for truly secure data encryption to the marketplace after nearly 20 years of development. By harnessing the quantum properties of light for generating random numbers, and creating cryptographic keys with lightning speed, the technology enables a completely new commercial platform for real-time encryption at high data rates. If implemented on a wide scale, quantum key distribution technology could ensure truly secure commerce, banking, communications and data transfer. The Los Alamos technology is simple and compact enough that it could be made into a unit comparable to a computer thumb drive or compact data-card reader. Units could be manufactured at extremely low cost, putting them within easy retail range of ordinary electronics consumers.
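
Why fast, truly random bits matter is easiest to see in the textbook one-time pad, whose security rests entirely on key randomness; the sketch below uses os.urandom as a stand-in for the quantum source.

```python
# A minimal sketch: a one-time pad is provably secure only if its key is
# truly random, as long as the message, and never reused. Here os.urandom
# stands in for the quantum random-number generator described above.
import os

def xor_bytes(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

message = b"wire $100 to account 42"
key = os.urandom(len(message))        # one fresh random byte per message byte
ciphertext = xor_bytes(message, key)
assert xor_bytes(ciphertext, key) == message   # XOR with the key decrypts
```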

View LANL Data Science Press Release

September 3, 2014 — UCLA-led consortium to focus on developing a new architecture for the Internet

Internet of tomorrow: TCP/IP, make way for NDN
UCLA 9/3/2014—Launching a critical new phase in developing the Internet of the future, UCLA will host a consortium of universities and leading technology companies to promote the development and adoption of Named Data Networking (NDN). NDN is an emerging Internet architecture that promises to increase network security, accommodate growing bandwidth requirements and simplify the creation of increasingly sophisticated applications. The NDN team’s goal is to build a replacement for Transmission Control Protocol/Internet Protocol (TCP/IP), the current underlying approach to all communication over the Internet. NDN leverages empirical evidence about what has worked on the Internet and what hasn’t, adapting to changes in usage over the past 30-plus years and simplifying the foundation for development of mobile platforms, smart cars and the Internet of Things—in which objects and devices are equipped with embedded software and are able to communicate with wireless digital networks.

View UCLA Data Science Press Release

September 2, 2014 — Genetic diversity of corn is declining in Mexico

Genetic diversity of corn is declining in Mexico
The new study contradicts some earlier and more optimistic assessments of corn diversity in Mexico. Credit: Thinkstock photo
UCD 9/2/2014—The genetic diversity of maize, or corn, is declining in Mexico, where the world’s largest food crop originated, report researchers in Mexico and at UC Davis. The findings are particularly sobering at a time when agriculturists around the world are looking to the gene pools of staple foods like corn to dramatically increase food production for a global population expected to top 9 billion by 2050. The new study, which contradicts some earlier and more optimistic assessments of corn diversity in Mexico, appears online in the Proceedings of the National Academy of Sciences. This study—the first to examine changes in maize diversity across Mexico—compares maize diversity estimates from 38 case studies over the past 15 years with data from farmers throughout Mexico. “The question of diversity finally can be answered for maize, thanks to a unique database gathered through this binational project,” said lead author George A. Dyer of El Colegio de México, in Mexico City.

View UCD Data Science Press Release

August 27, 2014 — Encyclopedia of how genomes function gets much bigger

How is a human like a roundworm, or fruit fly?
Berkeley Lab scientists contributed to an NHGRI effort that provides the most detailed comparison yet of how the genomes of the fruit fly, roundworm, and human function. Credit: Darryl Leja, NHGRI
LBNL 8/27/2014—A big step in understanding the mysteries of the human genome was taken in three analyses that provide the most detailed comparison yet of how the genomes of the fruit fly, roundworm, and human function. The research, appearing August 28 in the journal Nature, compares how the information encoded in the three species’ genomes is “read out,” and how their DNA and proteins are organized into chromosomes. The results add billions of entries to a publicly available archive of functional genomic data. Scientists can use this resource to discover common features that apply to all organisms. These fundamental principles will likely offer insights into how the information in the human genome regulates development, and how it is responsible for diseases.

View LBNL Data Science Press Release

August 26, 2014 — Existing power plants will spew 300 billion more tons of carbon dioxide during use

Committed to global warming: 300 gigatons of CO2
A coal-burning power plant at the Turceni Power Station in Romania. Credit: Robert and Mihaela Vicol
UCI 8/26/2014—Existing power plants around the world will pump out more than 300 billion tons of carbon dioxide over their expected lifetimes, significantly adding to atmospheric levels of the climate-warming gas, according to UC Irvine and Princeton University scientists. Using a new mathematical technique called commitment accounting, their study is the first to quantify how quickly these “committed” emissions are growing—by about 4 percent per year—as more fossil fuel-burning power plants are built. “These facts are not well known in the energy policy community, where annual emissions receive far more attention than future emissions related to new capital investments,” the paper states. The study was published in the August 26 issue of the journal Environmental Research Letters.
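
In its simplest form, commitment accounting multiplies each plant's annual emissions by its remaining expected lifetime and sums over the fleet; a toy sketch with invented numbers:

```python
# A toy version of commitment accounting: committed emissions are each
# plant's annual CO2 output times its remaining expected lifetime, summed
# over the fleet. All numbers below are invented for illustration.
plants = [
    # (annual emissions in megatons of CO2, expected years left in service)
    (20.0, 35),   # hypothetical coal plant
    (8.0, 28),    # hypothetical gas plant
    (15.0, 40),   # hypothetical newly built coal plant
]

committed = sum(rate * years for rate, years in plants)
print(f"committed future emissions: {committed:.0f} Mt CO2")  # 1524 Mt
```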

View UCI Data Science Press Release

August 27, 2014 — National Biomedical Computation Resource receives $9 million NIH award

NIH awards $9 million for biomedical computation
National Biomedical Computation Resource director Rommie Amaro
SDSC 8/27/2014—The National Biomedical Computation Resource (NBCR) at UC San Diego has received $9 million in funding over five years from the National Institutes of Health (NIH) to allow NBCR to continue its work connecting biomedical scientists with supercomputing power and emerging information technologies. NBCR Director Rommie Amaro said that the renewed funding will make it possible for biomedical researchers to study phenomena from the molecular level to the level of the whole organ. Biomedical computation—which applies physical modeling and computer science to the field of biomedical sciences—is often a cheaper alternative to traditional experimental approaches and can speed the rate at which discoveries are made for a host of human diseases and biological processes.

View SDSC Data Science Press Release

August 26, 2014 — Photon speedway puts big data in the fast lane

Lights on! Big Data in the fast lane
Scientists envision ESnet serving as a “photon science speedway” connecting Stanford’s advanced light sources with computing resources at Berkeley Lab.
LBNL/NERSC 8/26/2014—A series of experiments conducted by Lawrence Berkeley National Laboratory (LBNL) and SLAC National Accelerator Laboratory (SLAC) researchers and collaborators is shedding new light on photosynthesis, and demonstrating how light sources and supercomputing facilities can be linked via a “photon science speedway” called ESnet to better address emerging challenges in massive data analysis. Last year, LBNL and SLAC researchers led a protein crystallography experiment at SLAC’s Linac Coherent Light Source to look at the different photoexcited states of an assembly of large protein molecules that play a crucial role in photosynthesis. Subsequent analysis of the data on supercomputers at the Department of Energy’s (DOE’s) National Energy Research Scientific Computing Center (NERSC) helped explain how nature splits a water molecule during photosynthesis, a finding that could advance the development of artificial photosynthesis for clean, green and renewable energy.

View LBNL,NERSC Data Science Press Release

August 25, 2014 — New project is the ACME of addressing climate change

LBNL 8/25/2014—Eight Department of Energy national laboratories, including Lawrence Berkeley National Laboratory (LBNL), are combining forces with the National Center for Atmospheric Research, four academic institutions and one private-sector company in a new climate modeling effort. Other participating national laboratories include Argonne, Brookhaven, Lawrence Livermore, Los Alamos, Oak Ridge, Pacific Northwest and Sandia. The project, called Accelerated Climate Modeling for Energy (ACME), is designed to accelerate the development and application of fully coupled, state-of-the-science Earth system models for scientific and energy applications. Over a planned 10-year span, the project aims to conduct simulations and modeling on the most sophisticated HPC machines as they become available, i.e., 100-plus petaflop machines and eventually exascale supercomputers.

View LBNL Data Science Press Release

August 22, 2014 — NASA picks top Earth data challenge ideas, opens call for climate apps

Psst! Want $50,000? Design a big-data climate app
OpenNEX Challenge Phase II. Credit: NASA
NASA Ames 8/22/2014—NASA has selected four ideas from the public for innovative uses of climate projections and Earth-observing satellite data. The agency also has announced a follow-on challenge with awards of $50,000 to build climate applications based on OpenNEX data on the Amazon cloud computing platform. Both challenges use the Open NASA Earth Exchange, or OpenNEX, a data, cloud computing, and knowledge platform where users can share modeling and analysis codes, scientific results, information and expertise to solve big data challenges in the Earth sciences. OpenNEX provides users a large collection of climate and Earth science satellite data sets, including global land surface images, vegetation conditions, climate observations and climate projections.

View AMES Data Science Press Release

August 21, 2014 — SDSC mourns the loss of J. Freeman Gilbert

SDSC mourns the loss of J. Freeman Gilbert
J. Freeman Gilbert. Credit: Scripps Institution of Oceanography Archives, UC San Diego Library
SDSC 8/21/2014—In addition to being a Distinguished Professor of geophysics with the Scripps Institution of Oceanography (SIO) for more than 53 years, James Freeman Gilbert is being remembered by those at the San Diego Supercomputer Center (SDSC) for playing an integral role in securing a National Science Foundation (NSF) award to establish SDSC on the UC San Diego campus almost 30 years ago. A leading contributor in computational geophysics, seismology, earthquake sources, and geophysical inverse theory, Gilbert was the author of numerous research papers, book chapters, reviews, and other publications. He passed away on August 15, 2014 in Portland, Oregon, due to complications from a car accident. He was 83.

View SDSC Data Science Press Release

August 20, 2014 — Livermore researchers create engineered energy absorbing material

Soft or firm? Programming the cushioning of foam
A silicone cushion with programmable mechanical energy absorption properties was produced through a 3D printing process using a silicone-based ink by Lawrence Livermore National Laboratory researchers.
LLNL 8/20/2014—Solid gels and porous foams are used for padding and cushioning, but each has its own advantages and limitations. Gels are effective but are relatively heavy and their lack of porosity gives them a limited range of compression. Foams are lighter and more compressible, but their performance is not consistent due to the inability to accurately control the size, shape and placement of voids (air pockets) when the foam is manufactured. Now, with the advent of additive manufacturing—also known as 3D printing—a team of engineers and scientists at Lawrence Livermore National Laboratory (LLNL) has found a way to design and fabricate, at the microscale, new cushioning materials with a broad range of programmable properties. LLNL researchers constructed cushions using two different configurations, and were able to model and predict the performance of each of the architectures under both compression and shear. “The ability to dial in a predetermined set of behaviors across a material at this resolution is unique, and it offers industry a level of customization that has not been seen before,” said the lead author of a paper published in the August 20 issue of Advanced Functional Materials.

View LLNL Data Science Press Release

August 19, 2014 — New project is the ACME of addressing climate change

8 national labs+: ACME of climate change models
A massive crack runs about 29 kilometers (18 miles) across the Pine Island Glacier’s floating tongue, marking the moment of creation for a new iceberg that will span about 880 square kilometers (340 square miles) once it breaks loose from the glacier. The onset of the collapse of the Antarctic Ice Sheet is one area a new project headed by Lawrence Livermore will examine. Credit: Quantum Filmmakers
LLNL 8/19/2014—High performance computing (HPC) will be used to develop and apply the most complete climate and Earth system model to address the most challenging and demanding climate change issues. Eight national laboratories, including Lawrence Livermore (LLNL), are combining forces with the National Center for Atmospheric Research, four academic institutions and one private-sector company in the new effort. Other participating national laboratories include Argonne, Brookhaven, Lawrence Berkeley (LBNL), Los Alamos (LANL), Oak Ridge, Pacific Northwest and Sandia. The project, called Accelerated Climate Modeling for Energy (ACME), is designed to accelerate the development and application of fully coupled, state-of-the-science Earth system models for scientific and energy applications. The plan is to exploit advanced software and new high performance computing machines as they become available. The initial focus will be on three climate change science drivers (the water cycle, biogeochemistry, and the cryosphere) and corresponding questions.

View LLNL Data Science Press Release

August 19, 2014 — California is deficit-spending its water

California is deficit-spending its water
Cumulative water right allocations relative to mean annual runoff, excluding water rights for hydropower generation
UCM 8/19/2014—California is deficit-spending its water and has been for a century, according to state data analyzed recently by researchers from the University of California. UC Merced professor Joshua Viers and postdoctoral researcher Ted Grantham, with UC Davis at the time, explored the state’s database of water-rights allocations and found that allocations in California since 1914 total roughly five times the state’s average annual runoff, and up to 100 times the actual surface-water supply in some river basins. “We’re kind of in big trouble,” Viers said, considering the changing climate and the expectation that more frequent, multi-year droughts are the new normal. The analysis, entitled “100 years of California’s Water Rights System: Patterns, Trends and Uncertainty,” by Viers, the director of the Center for Information Technology Research in the Interest of Society (CITRIS) at UC Merced, and Grantham, now a scientist with the U.S. Geological Survey, was published in the August 19 issue of the journal Environmental Research Letters.
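
The underlying comparison is straightforward arithmetic, cumulative allocations divided by mean annual runoff in each basin; a toy sketch with invented figures:

```python
# A toy version of the comparison in the study: cumulative water-right
# allocations divided by mean annual runoff, per basin. All figures
# below are invented for illustration.
basins = {
    # basin: (allocated, mean annual runoff), million acre-feet per year
    "Basin A": (12.0, 2.1),
    "Basin B": (30.5, 6.3),
    "Basin C": (8.4, 8.0),
}

for name, (allocated, runoff) in basins.items():
    ratio = allocated / runoff
    flag = "over-allocated" if ratio > 1 else "within supply"
    print(f"{name}: {ratio:.1f}x mean annual runoff ({flag})")
```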

View UCM Data Science Press Release

August 19, 2014 — California has given away rights to far more water than it has

California allocates 5x more water than it has
On average, California allocates more than five times the amount of water to users than is available in its rivers and streams. Credit: Josh Viers/UC Merced
UCD 8/19/2014—California has allocated five times more surface water than the state actually has, making it hard for regulators to tell whose supplies should be cut during a drought, reported researchers from two UC campuses. The scientists said California’s water-rights regulator, the State Water Resources Control Board, needs a systematic overhaul of policies and procedures to bridge the gaping disparity, but lacks the legislative authority and funding to do so. Ted Grantham, who conducted the analysis with UC Merced professor Joshua Viers and who explored the state’s water-rights database as a postdoctoral researcher with the UC Davis Center for Watershed Sciences, said the time is ripe for tightening water-use accounting. Viers and Grantham, now with the U.S. Geological Survey, are working to iron out issues with the state’s water-rights database and make the information available to policymakers.

View UCD Data Science Press Release

August 18, 2014 — SPOT Suite transforms beamline science

SPOT Suite transforms beamline science
Advanced Light Source (ALS) at Berkeley Lab. Credit: Roy Kaltschmidt
LBNL/NERSC 8/18/2014—For decades, synchrotron light sources such as the DOE’s Advanced Light Source (ALS) at LBNL have been operating on a manual grab-and-go data management model—users travel thousands of miles to run experiments at the football-field-size facilities, download raw data to an external hard drive, then process and analyze the data on their personal computers, often days later. But the deluge of data brought on by faster detectors and brighter light sources is quickly making this practice implausible. Fortunately, ALS X-ray scientists, facility users, computer and computational scientists from the Computational Research Division (CRD) and National Energy Research Scientific Computing Center (NERSC) of Lawrence Berkeley National Laboratory (LBNL) recognized this developing situation years ago and teamed up to create new tools for reducing, managing, analyzing and visualizing beamline data. The result of this collaboration is SPOT Suite, which is already transforming the way scientists run their experiments at the ALS. The goal is for beamline scientists to be able to access computational resources without having to become computer experts.

View LBNL,NERSC Data Science Press Release

August 14, 2014 — New tool makes a single picture worth a thousand — and more — images

A picture is worth 1,000 pictures
Researchers harness big visual data by creating an average image from thousands of photos. Video by Jun-Yan Zhu, Yong Jae Lee and Alexei Efros, UC Berkeley
UCB 8/14/2014—New software developed by UC Berkeley (UCB) computer scientists seeks to tame the vast amount of visual data in the world by generating a single photo that can represent massive clusters of images. It works by generating an image that literally averages the key features of the other photos. Users can also give extra weight to specific features to create subcategories and quickly sort the image results. “Visual data is among the biggest of Big Data,” said Alexei Efros, UCB associate professor of electrical engineering and member of the UCB Visual Computing Lab. “We have this enormous collection of images on the Web, but much of it remains unseen by humans because it is so vast. People have called it the dark matter of the Internet. We wanted to figure out a way to quickly visualize this data by systematically ‘averaging’ the images.” Efros worked with Jun-Yan Zhu, UCB computer science graduate student and the paper’s lead author, and Yong Jae Lee, former UCB postdoctoral researcher, to develop the system, which they have dubbed AverageExplorer.
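
The kernel of the idea, a weighted average over aligned photos, can be sketched in a few lines of NumPy; the real AverageExplorer adds far more sophisticated alignment and interactive feature weighting.

```python
# A minimal sketch of weighted image averaging, the core idea behind a
# tool like AverageExplorer (the actual system is far more sophisticated).
import numpy as np

def average_image(images, weights=None):
    """Average a list of HxWx3 uint8 arrays into one representative image."""
    stack = np.stack([img.astype(np.float64) for img in images])
    if weights is None:
        weights = np.ones(len(images))
    weights = np.asarray(weights, dtype=np.float64)
    weights /= weights.sum()
    # Weighted mean over the image axis; larger weights let specific photos
    # (e.g., those matching a user-highlighted feature) dominate the average.
    avg = np.tensordot(weights, stack, axes=1)
    return avg.clip(0, 255).astype(np.uint8)

# Example: three random stand-in "photos" of the same size.
imgs = [np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8) for _ in range(3)]
print(average_image(imgs, weights=[1.0, 1.0, 3.0]).shape)  # (64, 64, 3)
```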

View UCB Data Science Press Release

August 14, 2014 — State of the lake: High (tech) and dry at Tahoe

New data network monitors health of Lake Tahoe
A monitoring station at Lake Tahoe, California. Credit: Gregory Urquiaga/UC Davis
UCD 8/14/2014—Forty-six years after UC Davis first began continuous monitoring of Lake Tahoe, an array of new technologies and computer models is helping scientists better understand what has proven to be a complex ecosystem. The complexities are examined in a report “Tahoe: State of the Lake Report 2014,” released August 14 by the Tahoe Environmental Research Center at UC Davis. The report explains how drought, climate change, and other natural and human factors are driving changes at Lake Tahoe. By combining a half-century of data collection at Lake Tahoe with climate-change forecasts, scientists found that summerlike conditions have been extended at Lake Tahoe. By the end of this century, summer may be two months longer than it was in the 1960s, and maximum temperatures may have risen by 8 degrees Fahrenheit. The report also describes the new Real-time Nearshore Water Quality Network of about 20 monitoring stations—the first six of which were installed in August—to gather minute-by-minute data about changing water quality conditions to explore what is causing degradation to Lake Tahoe’s nearshore environment.

View UCD Data Science Press Release

August 13, 2014 — UCLA professor develops digital resources for study of ancient Egypt

Digital resources for exploring ancient Egypt
Willeke Wendrich stands with a headless statue of Ramses at Karanis, an archaeological site in the Fayum Oasis, located 62 miles southwest of Cairo. She has a long-running excavation and site-management project there. Credit: Fatma Faroux
UCLA 8/13/2014—Willeke Wendrich, a faculty member in UCLA’s Department of Near Eastern Languages and Cultures, is spearheading several digital humanities projects centered on Egypt. Helping her achieve her research goals have been the consultants and technologists of UCLA’s Institute for Digital Research and Education (IDRE). Realizing the need for an online encyclopedia for archaeologists, Wendrich applied for a National Endowment for the Humanities (NEH) grant to launch the UCLA Encyclopedia of Egyptology. Wendrich is also immersed in The Digital Karnak project, which enables users to virtually explore one of the world’s most expansive temple complexes in Egypt. Wendrich chairs the editorial board of the Cotsen Institute of Archaeology Press and serves on the executive committee for IDRE’s Humanities, Arts, Architecture, Social and Information Collaborative.

View UCLA Data Science Press Release

August 13, 2014 — Statistical model predicts performance of hybrid rice

Genomic prediction: pilot study on hybrid rice
Long-grain rice. Credit: Keith Weller, USDA
UCR 8/13/2014—Genomic prediction, a new field of quantitative genetics, is a statistical approach to predicting the value of an economically important trait in a plant, such as yield or disease resistance. The method works if the trait is heritable, as many traits tend to be, and can be performed early in the life cycle of the plant, helping reduce costs. Now a research team led by plant geneticists at UC Riverside and Huazhong Agricultural University, China, has used the method to predict the performance of hybrid rice (for example, yield, growth rate and disease resistance). The new technology could potentially revolutionize hybrid breeding in agriculture. The study, published online in the August 26 issue of the Proceedings of the National Academy of Sciences, is a pilot research project on rice. The technology can be easily extended, however, to other crops such as maize.
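
In spirit, genomic prediction fits a whole-genome regression on the marker genotypes of phenotyped lines and applies it to untested ones; the sketch below uses simulated data, with ridge regression standing in for the GBLUP-style models common in the field.

```python
# A minimal sketch of genomic prediction on simulated data. Ridge
# regression stands in for the GBLUP-style models used in practice;
# nothing here comes from the UCR/Huazhong study itself.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_lines, n_markers = 200, 1000
X = rng.integers(0, 3, size=(n_lines, n_markers)).astype(float)  # 0/1/2 genotypes
effects = rng.normal(0.0, 0.05, n_markers)          # small true marker effects
y = X @ effects + rng.normal(0.0, 1.0, n_lines)     # heritable trait + noise

model = Ridge(alpha=100.0).fit(X[:150], y[:150])    # train on phenotyped lines
pred = model.predict(X[150:])                       # predict untested lines
print("predictive accuracy r =", round(np.corrcoef(pred, y[150:])[0, 1], 2))
```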

View UCR Data Science Press Release

August 12, 2014 — Longtime UCLA professor earns highest honor in applied mathematics

UCLA’s Osher awarded prestigious Gauss Prize
UCLA professor Stanley Osher. Credit: Christelle Nahas/UCLA Newsroom
UCLA 8/12/2014—Stanley Osher, UC Los Angeles professor of mathematics and former director of applied mathematics, is the third person ever to be awarded the prestigious Gauss Prize, the highest honor in applied mathematics. Osher has collaborated with colleagues in a wide range of fields and the mathematical techniques he has pioneered have been highly influential. The results of his research have improved MRI scans and medical image analysis, advanced computer chip design, helped law enforcement agencies combat crime, enhanced computer vision, provided new ways to forecast weather and identify the source of earthquakes, and even revolutionized computer modeling for the design of supersonic jets.

View UCLA Data Science Press Release

August 7, 2014 — Cancer study reveals powerful new system for classifying tumors

Powerful new system for classifying cancers
This diagram illustrates how tumors with different tissues of origin were reclassified on the basis of molecular analyses. Credit: Zhong Chen, NIH/NIDCD
UCSC 8/7/2014—Cancers are classified primarily on the basis of where in the body the disease originates, as in lung cancer or breast cancer. According to a new study, however, one in ten cancer patients would be classified differently using a new classification system based on molecular subtypes instead of the current tissue-of-origin system. This reclassification could lead to different therapeutic options for those patients, scientists reported in a paper published August 7 in Cell. The study involved an enormous amount of molecular and clinical data, which was managed by a software developer at UC Santa Cruz. UCSC researchers worked with the bioinformatics company Sage Bionetworks to create Synapse as a data repository for the Pan-Cancer Initiative. The data sets and results have been made available to other researchers through the Synapse web site.

View UCSC Data Science Press Release

August 8, 2014 — Cancer categories recast in largest-ever genomic study

 Largest-ever genomic study recasts cancers
UCSF 8/7/2014—New research partly led by UC San Francisco-affiliated scientists suggests that one in 10 cancer patients would be more accurately diagnosed if their tumors were defined by cellular and molecular criteria rather than by the tissues in which they originated. Such information, in turn, could lead to more appropriate treatments. In the largest study of its kind to date, scientists analyzed molecular and genetic characteristics of more than 3,500 tumor samples of 12 different cancer types using multiple genomic technology platforms.

View UCSF Data Science Press Release

August 4, 2014 — Global economic losses from cyclones linger for decades, study finds

Hurricanes depress economies for decades
A new study indicates that cyclones over the last 60 years have slowed the annual growth rate of the global domestic product by about 1.3 percent.
UCB 8/4/2014—Economic losses due to hurricanes worldwide continue for decades after disastrous storms strike; moreover, the losses are not alleviated by spending on reconstruction, and may climb with storms that are intensified by climate change. These are among the key findings of a new study that represents the first comprehensive history of global hurricane exposure, analyzing the economic impact of cyclones in painstaking detail. The two coauthors—one a UC Berkeley economist and public policy professor—amassed data on all countries’ exposure to cyclones based on ground-, ship- and satellite-based meteorological observations, and used a physical model to recreate the wind intensity at every point on the planet’s surface for 6,700 cyclones since 1950. Next, they combined this huge dataset with economic datasets for statistical analysis. The study debunks the popularly held belief that environmental disasters disrupt business only for a few months, or at most for a couple of years, and have no long-term effects. On the contrary, a cyclone can undo years of economic development in a single day, reducing per capita incomes by up to 7.4 percent two decades later—akin to rewinding an economy 3.7 years. The findings are laid out in “The Causal Effect of Environmental Catastrophe on Long-Run Economic Growth: Evidence From 6,700 Cyclones,” a working paper released Aug. 4 by the National Bureau of Economic Research (NBER).

View UCB Data Science Press Release

August 4, 2014 — Berkeley to host international neuroscience database to speed brain discoveries

UCB/Kavli Foundation 8/4/2014—UC Berkeley, a partner in “Neurodata Without Borders,” will host a neuroscience database designed to make digital information about the brain more usable and accessible – a step seen as critical to accelerating the pace of discoveries about the brain in health and disease. The Allen Institute for Brain Science, California Institute of Technology, New York University School of Medicine, the Howard Hughes Medical Institute (HHMI), and UC Berkeley are collaborating on the project. With funding from GE, The Kavli Foundation, the Allen Institute for Brain Science, the HHMI, and the International Neuroinformatics Coordinating Facility (INCF), the year-long project will focus on standardizing a subset of neuroscience data, making this research simpler for scientists to share.

View UCB Data Science Press Release

August 4, 2014 — The best of both worlds

The best of both worlds
Stefano Tessaro. Credit: Sonia Fernandez
UCSB 8/04/2014—Security standardization is a double-edged sword. An encryption algorithm that gets recognized by an authority such as the National Institute of Standards and Technology (NIST) will be put into wide use, even embedded into chips that are built into computers. That’s great for efficiency and reliability—but if there’s a successful attack, the vast majority of the world’s electronic communications are suddenly vulnerable to decryption and hacking. Moreover, the cost of security is speed: the most secure cryptographic algorithms are not the fastest. Funded by a $500,000 grant from the National Science Foundation’s Secure and Trustworthy Cyberspace program, UC Santa Barbara cryptologist Stefano Tessaro and his team will study what it would take to devise algorithms researchers know to be secure while maintaining the level of service (i.e. speed) internet users have come to expect.

View UCSB Data Science Press Release

July 31, 2014 — UCLA Engineering to lead new NSF-funded cybersecurity research center

 Embrace obfuscation
Amit Sahai, director of the Center for Encrypted Functionalities. Credit: UCLA Engineering
UCLA 7/31/2014—The UCLA Henry Samueli School of Engineering and Applied Science is leading a new multi-institution research center on cybersecurity that is funded by a five-year $5 million grant from the National Science Foundation. The Center for Encrypted Functionalities, which opened July 31, will advance the study of a technique called program obfuscation: the use of new encryption methods to make a computer program, and not just its output, invisible to an outside observer, while preserving its functionality, or the way it works. The center—a collaboration among researchers at UCLA, Stanford University, Columbia University, the University of Texas at Austin, and Johns Hopkins University—is headed by Amit Sahai, UCLA professor of computer science. Last year, Sahai and colleagues devised the first mathematically sound approach to encrypting functionalities, a breakthrough that could reshape the way we think about security and computation. Their innovative approach uses a “multilinear jigsaw puzzle” approach, so an unauthorized user trying to find out how a protected piece of software worked would find only nonsensical jumbles of numbers.

View UCLA Data Science Press Release

July 31, 2014 — Global Alliance for Genomics and Health unveils new genomics interface

New genomics interface
David Haussler directs the UC Santa Cruz Genomics Institute. Credit: R. R. Jones
UCSC 7/31/2014—The Global Alliance for Genomics and Health, founded in 2013, has released a new application programming interface (API) developed by the Global Alliance’s Data Working Group that will allow DNA data providers and consumers to better share information and work together on a global scale. David Haussler, professor of biomolecular engineering at UC Santa Cruz, is co-chair of the Data Working Group and a cofounder of the Alliance. The new open-source Genomics API, referred to as Version 0.5, is a standard, open tool promoting data interoperability, allowing the wider bioinformatics community to participate. “The Global Alliance is breaking new ground in combining genomic sequencing and clinical care,” said Matt Wood, general manager of data science for Amazon Web Services. “We view these new APIs as a vital component for collaboration and development of next-generation tools that can run cost-effectively at massive scale.”

View UCSC Data Science Press Release

July 30, 2014 — Gene behind rare birth abnormality is a window on evolution

Genetic birth defect is a window onto evolution
UCSF 7/30/2014—A developmental protein called ectodysplasin, if defectively encoded in a key gene, causes a rare human birth defect: a shortage or absence of sweat glands, misshapen and absent teeth, and loss of hair follicles—all appendages that develop from the same embryonic tissue. Previously, researchers in Switzerland had found that the syndrome in mice can be treated during the mother’s gestation by administering the missing ectodysplasin—the first demonstration that a structural birth defect could be prevented with a medical approach. Now, a UC San Francisco (UCSF) physician who treats birth defects affecting the face has teamed up with a European expert on animal evolution to see if the same biochemical pathway also could be manipulated to study evolution. In an analysis conducted through computer modeling and the measurement of teeth from different mammals, the scientists showed how both rodents and carnivores—including later-evolving mammals such as the lion, wolf, and bear—follow the same rule of shape change suggested by the experiment. The study was published in the journal Nature.

View UCSF Data Science Press Release

July 30, 2014 — UC San Diego’s WIFIRE project helps firefighters get a jump on wildfires

WIFIRE project helps firefighters
New cyberinfrastructure system monitors and forecasts wildfire activity. Credit: WIFIRE/UC San Diego
SDSC 7/30/2014—With a multi-year, $2.65 million grant from the National Science Foundation (NSF), UC San Diego and the University of Maryland have been building a cyberinfrastructure to better monitor, predict, and mitigate wildfires in the future. The project, called WIFIRE and started in late 2013, is already cataloging and integrating data related to dynamic wildfire models from a variety of resources including sensors, satellites, and scientific models, and creating visual programming interfaces for using those data in scalable wildfire models. The project will hold its first workshop on dynamic data-driven wildfire modeling in January 2015. The system will integrate networked observations—such as heterogeneous satellite data and real-time remote sensor data—with computational techniques in signal processing, visualization, modeling, and data assimilation, providing a scalable way to monitor wildfires much as weather patterns are monitored and helping predict a fire’s rate of spread.

View SDSC Data Science Press Release

July 29, 2014 — 1996 research article deemed a classic paper

1996 paper on personalizing searches now classic
Michael Pazzani is the vice chancellor for research and economic development at UC Riverside. Credit: Carlos Puma
UCR 7/29/2014—A 1996 research paper authored by UC Riverside’s Michael J. Pazzani and two colleagues has been selected by the Association for the Advancement of Artificial Intelligence (AAAI) to win the 2014 Classic Paper Award. The AAAI, which promotes theoretical and applied artificial intelligence research, established the award in 1999 to honor authors of papers, chosen from a specific conference year, that were deemed most influential. The research paper is “Syskill & Webert: Identifying Interesting Web Sites,” published in the proceedings of The Thirteenth National Conference on Artificial Intelligence (AAAI-96). Pazzani’s coauthors are Jack Muramatsu and Daniel Billsus. Their paper showed how a profile of the user can be learned by any learning algorithm from a user’s feedback on any web page and how this profile can be used to predict the user’s interest in web pages. By combining the system with a search engine, the paper showed how search results can be personalized and how a query can be constructed to search for content that interests the user. The research paper directly led to commercial applications, and today the personalization of content on the internet may be the most common application of artificial intelligence encountered by the average person.
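
A minimal sketch of the idea with toy data: learn a profile from pages the user has rated, then score an unseen page. The original work compared several learning algorithms, naive Bayes among them.

```python
# A minimal sketch of the Syskill & Webert idea on toy strings: learn a
# user profile from rated pages, then score an unseen page. The original
# paper evaluated several learning algorithms, including naive Bayes.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

pages = ["deep sky telescope review", "cheap flights hotel deals",
         "galaxy survey data release", "discount shopping coupons"]
ratings = [1, 0, 1, 0]          # 1 = user marked "hot", 0 = "cold"

vec = CountVectorizer()
X = vec.fit_transform(pages)    # bag-of-words features per page
profile = MultinomialNB().fit(X, ratings)

new_page = ["new telescope imaging survey"]
print(profile.predict_proba(vec.transform(new_page)))  # [P(cold), P(hot)]
```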

View UCR Data Science Press Release

July 22, 2014 — Three Los Alamos scientists named ‘Most Influential Scientific Minds’

Modeling blocks to Hepatitis C virus
Left to right: Bette Korber, Alan Perelson and Allison Aiken
LANL 7/22/2014—Los Alamos National Laboratory (LANL) scientist Alan Perelson is one of three LANL researchers named to the Thomson Reuters list of “The World’s Most Influential Scientific Minds.” Perelson—a Senior Fellow of the Laboratory’s Theoretical Biology and Biophysics group, an adjunct professor of bioinformatics at Boston University, an adjunct professor of biology at the University of New Mexico, and an adjunct professor of biostatistics at the University of Rochester’s School of Medicine—is part of a multinational team whose work contributed to the understanding of the Hepatitis C virus and a possible cure. In that work, a mathematical technique called “viral kinetic modeling” seeks to characterize the main mechanisms that govern the virus’s response to treatment. These computer simulations showed that the drug in question not only blocked two distinct processes, as other antivirals do, but also blocked the release of the virus from infected cells. The research results indicate that daily viral production could be four times larger than previously thought, which has implications for the development of mutations that could lead to drug resistance.
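
A minimal sketch of what a viral kinetic model looks like: a simplified two-equation treatment model with illustrative parameter values, not those of the study itself.

```python
# A minimal sketch of viral kinetic modeling: a simplified two-equation
# version of the standard treatment model (infected cells I, free virus V).
# Parameter values are illustrative, not taken from the LANL work.
import numpy as np
from scipy.integrate import solve_ivp

eps = 0.95            # fraction of virion production blocked by the drug
p, c = 10.0, 6.0      # virion production per cell and clearance (per day)
delta = 0.14          # infected-cell death rate (per day)
I0 = 1e7
V0 = p * I0 / c       # pre-treatment steady state: production = clearance

def rhs(t, y):
    I, V = y
    dI = -delta * I                   # infected cells die off
    dV = (1 - eps) * p * I - c * V    # reduced production minus clearance
    return [dI, dV]

sol = solve_ivp(rhs, (0.0, 14.0), [I0, V0], t_eval=np.linspace(0, 14, 8))
for t, V in zip(sol.t, sol.y[1]):
    print(f"day {t:5.1f}: viral load {V:9.3e}")   # biphasic decline
```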

View LANL Data Science Press Release

July 21, 2014 — UCLA faculty contribute to international study on the biology behind schizophrenia

Schizophrenia risk: 108 locations on human genome
UCLA 7/21/2014—UCLA researchers were part of a multinational effort that has identified more than 100 locations in the human genome associated with the risk of developing schizophrenia. The study, the largest genomic study published on any psychiatric disorder to date, provides important new insights about the biological causes of schizophrenia, and it could lead to new approaches to treatment. The authors analyzed more than 80,000 genetic samples from people with and without schizophrenia. They identified 108 specific locations in the human genome associated with risk for schizophrenia, 83 of which had not previously been linked to the disorder. The study was conducted over several years by the Schizophrenia Working Group of the Psychiatric Genomics Consortium; it included 55 datasets from more than 40 different contributors. The report was published in the July 22 online edition of the journal Nature.

View UCLA Data Science Press Release

July 18, 2014 — Scientists enlist big data to guide conservation efforts

Big Data pinpoints best wildlife preserve sites
Using data on Australia’s acacia trees, the new model maps areas of endemism—the rainforests of southwest Western Australia, the Gascoyne region and Tasmania—where conservation efforts might preserve rare and endangered species.
UCB 7/18/2014—Despite a deluge of new information about the diversity and distribution of plants and animals around the globe, “big data” has yet to make a mark on conservation efforts to preserve the planet’s biodiversity. But that may soon change. A new model developed by UC Berkeley biologist Brent Mishler and his colleagues in Australia leverages this growing mass of data—much of it from newly digitized museum collections—to help pinpoint the best areas to set aside as preserves and to help biologists understand the evolutionary history of life on Earth. The model takes into account not only the number of species throughout an area—the standard measure of biodiversity—but also the variation among species and their geographic rarity, or endemism. The model, which requires intense computer calculations, is described in the journal Nature Communications.
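
One simple rarity-aware ingredient such models build on is weighted endemism, in which each species at a site contributes inversely to its range size; a toy sketch:

```python
# A toy version of "weighted endemism", one rarity-aware metric of the
# kind such models combine with phylogenetic information: each species at
# a site contributes 1 / (range size), so rare species count for more.
# Site and species names below are invented.
sites = {
    "rainforest_sw": ["acacia_a", "acacia_b", "acacia_c"],
    "gascoyne":      ["acacia_a", "acacia_d"],
    "tasmania":      ["acacia_e"],
}

# Range size = number of sites each species occupies.
range_size = {}
for species_list in sites.values():
    for sp in species_list:
        range_size[sp] = range_size.get(sp, 0) + 1

for site, species_list in sites.items():
    we = sum(1.0 / range_size[sp] for sp in species_list)
    print(f"{site}: weighted endemism = {we:.2f}")
```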

View UCB Data Science Press Release

July 18, 2014 — Pair awarded NSF grant to study ‘crowdprogramming’

UCI 7/18/2014—Informatics professor André van der Hoek and postdoctoral scholar Thomas LaToza have received a four-year, $1.4 million grant from the National Science Foundation for their research into what they call “crowdprogramming.” Crowdsourcing leverages the power of mass input by individuals to complete tasks that were previously too labor-intensive to be feasible; van der Hoek and LaToza propose applying those same principles to software development. The pair will explore whether crowdprogramming can be achieved and, if so, in what form, under what conditions, and with what benefits and drawbacks. They will also create a publicly available platform, CrowdCode, that will offer a tool set specifically designed to address the intricacies of crowdprogramming.

View UCI Data Science Press Release

July 18, 2014 — UCLA-led effort to improve medical computing gets $3M boost from public-private partnership

Custom computational medicine improves safety
How domain-specific computing can reduce radiation risk from CT scans. Credit: Center for Domain-Specific Computing
UCLA 7/18/2014—A group that designs high-performance, customizable computer technologies to improve health care has received a $3 million grant from a public-private partnership between the National Science Foundation and semiconductor giant Intel Corp. The award will help further the long-term efforts of the group, which aims to speed up the computing side of medicine through innovations in what is known as domain-specific computing. Their research has the potential to reduce dangerous radiation exposure during CT scans and lead to the development of patient-specific cancer treatments. In developing their health care computing innovations, the UCLA-led researchers have focused on domain-specific computing, which has significant advantages over general-purpose computing in medical applications. In domain-specific computing, researchers create custom hardware that can solve a range of related problems within a particular area or domain with high efficiency and flexibility.

View UCLA Data Science Press Release

July 15, 2014 — Drought impact study: California agriculture faces greatest water loss ever seen

Nation’s produce basket in danger of running dry
Managing groundwater reserves is key to California’s surviving long-term drought. Here, Senior Engineering Geologist Chris Bonds from the California Department of Water Resources monitors flow rate from a well. Credit: John Chacon/California Department of Water Resources
UCD 7/15/2014—California produces nearly half of U.S.-grown fruits, nuts and vegetables plus nearly a quarter of the nation’s milk and cream; but the nation’s produce basket may come up dry in the future if groundwater reserves continue to be treated like an unlimited savings account. So concluded a new study by the UC Davis Center for Watershed Sciences, released at a press briefing in Washington, D.C. The study updates estimates on the prolonged drought’s effects on Central Valley farm production, presents new data on the state’s coastal and southern farm areas, and forecasts the drought’s economic fallout through 2016. Groundwater reserves are being used to replace surface water losses. If the drought continues for two more years, however, pumping ability will slowly decrease, while costs and losses will slowly increase as the groundwater is depleted. To forecast the economic effects of the drought, the UC Davis researchers used computer models, remote satellite sensing data from NASA, and the latest estimates of State Water Project, federal Central Valley Project and local water deliveries and groundwater pumping capacities.

View UCD Data Science Press Release

July 14, 2014 — FAA approves data drone research at UC Merced

FAA grants UCM second permit for data drones
UCM 7/14/2014—In UC Merced’s well-established program on unmanned aerial systems (UAS) and their applications, faculty members and students are working on scientific data drones that can patrol wildfire perimeters, collect water samples, monitor pest situations in agricultural fields, check soil and crop conditions and much more. The FAA has granted a second certificate to allow researchers to fly drones over the Merced Vernal Pools and Grassland Reserve adjacent to campus to test multi-spectrum mosaic imaging to make the most comprehensive aerial survey of the property yet. Using the drones’ super high-resolution pictures (down to several millimeters), researchers can look at plant health, growth and stress. Professor YangQuan Chen has a vision for a research institute like UC Solar, where people from multiple UC campuses and disciplines work on various projects involving UAS.

View UCM Data Science Press Release

July 11, 2014 — Berkeley Lab wins three 2014 R&D 100 Awards

3D screening of sick cells
LBNL 7/11/2014—The U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (LBNL) has won three 2014 R&D 100 Awards. This year’s winners include a bioinformatics platform for screening 3-D cell culture models. Called BioSig3D, it is a computational platform for high-content screening of three-dimensional cell culture models that are imaged in full 3-D volume. It is primarily used for the study of aberrant organization that is typically caused by cancer, as well as the evaluation and quantification of the effects of exposure to radiation and environmental toxins. Presented by R&D Magazine, the R&D 100 Awards recognize the year’s top 100 technology products from industry, academia, and government-sponsored research, ranging from chemistry to materials to biomedical breakthroughs.

View LBNL Data Science Press Release

July 11, 2014 — Tapping real-time financial data can improve economic policymaking

Real-time $$ data = better economic policymaking
Researchers found spending jumps on the day payments come in, and stay high for several days, reflecting the convenience of paying big bills then. The chart above shows fast-food and coffee shop spending is pretty level all month.
UCB 7/11/2014—Measuring the nation’s economic health has long been a slow, costly and imprecise exercise. Traditionally, economic analysts have been forced to rely on large-scale surveys such as the Consumer Expenditure Survey or the Panel Study of Income Dynamics. But such surveys are complex and expensive to implement, so they are conducted infrequently and with modest-sized samples, with results released after substantial time lags. Because it is now possible to obtain fast, accurate, reliable, detailed, real-time and comprehensive information about daily personal financial spending and saving actions, government agencies and research organizations can capture rapidly changing economic information generated by households and businesses and adjust course if necessary, UC Berkeley researchers said in a study published in Science. Emmanuel Saez, economist and head of UCB’s Center for Equitable Growth, called the study “pathbreaking,” saying its use of a dataset with real-time, detailed information on income and spending could vastly improve economic policymaking. “This is a great example,” he said, “of how new technologies are generating new big data that can also have incredible value for scientific research.”

View UCB Data Science Press Release

July 8, 2014 — DARPA taps Lawrence Livermore to develop world’s first neural device to restore memory

Implantable chip to restore memory loss?
Lawrence Livermore engineer Vanessa Tolosa holds up a silicon wafer containing micromachined implantable neural devices.
LLNL 7/8/2014—The Department of Defense’s Defense Advanced Research Projects Agency (DARPA) awarded Lawrence Livermore National Laboratory (LLNL) up to $2.5 million to develop an implantable neural device with the ability to record and stimulate neurons within the brain to help restore memory, DARPA officials announced this week. The research builds on the understanding that memory is a process in which neurons in certain regions of the brain encode information, store it and retrieve it. Certain types of illnesses and injuries, including Traumatic Brain Injury (TBI), Alzheimer’s disease and epilepsy, disrupt this process and cause memory loss. TBI, in particular, has affected 270,000 military service members since 2000.

View LLNL Data Science Press Release

July 8, 2014 — Study shows design standards for dams are effective for earthquakes

Dam soil filters earthquake-safe say 3D models
A technician prepares an experiment in the U.S. Army Corps of Engineers’ Engineer Research and Development Center’s Centrifuge Research Center. The centrifuge, the world’s most powerful, used a one-foot-high model dam and subjected it to 30 times the force of gravity to mimic a 30-foot dam and generate data to validate Lawrence Livermore National Laboratory's numerical tools and simulations.
LLNL 7/8/2014—Civil engineers have long been concerned that dams could fail days or weeks after an earthquake, even if no immediate evidence of a problem surfaced. Their concern has focused on possible cracks at the interface between the concrete section of a dam and the soil embankments at the dam’s sides, and on how the soil filters nestled amidst the embankments would fare. Since soil filters were instituted, their design standards have been based on experimental studies without detailed and validated computer modeling of the soil grains—until now. For the first time, under a collaboration between Lawrence Livermore National Laboratory (LLNL) and the U.S. Army Corps of Engineers’ Engineer Research and Development Center (ERDC), researchers have completed a study demonstrating the effectiveness of soil filters at the soil grain scale. They modeled soil filters at the grain level in 3D and then bridged the interaction of soil erosion to the behavior of the dam itself. The collaboration’s computer simulations, validated by experimental tests run by the ERDC, show that dam filters meeting today’s standards for reservoirs provide effective protection.

View LLNL Data Science Press Release

July 7, 2014 — Two UC Santa Cruz earth scientists honored by American Geophysical Union

AGU honors computational geophysicist
Gary Glatzmaier. Credit: J. MacKenzie
UCSC 7/7/2014—The American Geophysical Union (AGU) has awarded medals to two UC Santa Cruz scientists in recognition of their breakthrough achievements in Earth science. One of them, Gary Glatzmaier, professor of Earth and planetary sciences at UCSC, will receive the AGU’s John Adam Fleming Medal recognizing his research on Earth’s magnetic field and the geodynamo in Earth’s core that maintains it. Glatzmaier developed the first dynamically consistent computer simulations of the geodynamo, revealing the origin of the geomagnetic field and helping to explain reversals of the magnetic field seen in the geologic record. He has also developed numerical models to study the interiors of the sun and other stars, as well as giant gas planets like Jupiter and Saturn.

View UCSC Data Science Press Release

July 7, 2014 — SDSC’s new ‘SeedMe’ resource invites researchers to rapidly share results

SeedMe: DropBox-plus for computational science
SDSC 7/7/2014—A team of SDSC researchers has developed a working implementation of SeedMe, short for ‘Swiftly Encode, Explore, Disseminate My Experiments.’ While SeedMe could be described in general terms as a DropBox for science, the computational research community requires several additional capabilities and integration tools. The team developed a web-based architecture that supports not only videos and other visualizations but also content such as plots, files, and tickers—all of which are essential to scientific research. The goal is to offer a means for scientists to seamlessly share and stream data-intensive visualizations on a variety of platforms, including mobile devices, to enable rapid sharing of content directly from applications running on HPC- or cloud-based resources. SeedMe differs from other web-based interfaces in its complete focus on efficiently sharing scientific content.

View SDSC Data Science Press Release

July 7, 2014 — SDSC’s kc claffy receives annual IEEE Internet Award

kc claffy receives IEEE Internet Award
kc claffy. Credit: Ben Tolo, SDSC
SDSC 7/7/2014—kc claffy, the principal investigator and co-founder of the Center for Applied Internet Data Analysis (CAIDA) based at the San Diego Supercomputer Center (SDSC) at UC San Diego, has been awarded the latest IEEE (Institute of Electrical and Electronics Engineers) Internet Award. claffy was recognized by the IEEE for her “seminal contributions to the field of Internet measurement, including security and network data analysis, and for distinguished leadership in and service to the Internet community by providing open-access data and tools.”

View SDSC Data Science Press Release

July 6, 2014 — Discovery provides insights on how plants respond to elevated CO2 levels

Coping with higher CO2? Plants hold their breath
The discovery could provide agricultural scientists with new tools to engineer crops that can deal with droughts and high temperatures. Credit: Peter Trimming
UCSD 7/6/2014—Biologists at UC San Diego have solved a long-standing mystery concerning the way plants reduce the number of their breathing pores in response to rising carbon dioxide (CO2) levels in the atmosphere. Their discovery should help biologists better understand how the steadily increasing levels of CO2 in our atmosphere (which last spring, for the first time in recorded history, remained above 400 parts per million) are affecting the ability of plants and economically important crops to deal with heat stress and drought induced by climate change. In a paper published in this week’s early online edition of Nature, they report the discovery of a new genetic pathway in plants, made up of four genes from three different gene families, that controls the density of breathing pores—or “stomata”—in plant leaves in response to elevated CO2 levels. Using a combination of systems biology and bioinformatic techniques, the scientists isolated proteins that, when mutated, abolish the plant’s ability to respond to CO2 stress.

View UCSD Data Science Press Release

July 3, 2014 — Ribosome research in atomic detail offers potential insights into cancer, anemia, Alzheimer’s

Rockin’ and rollin’ with the ribosome
The newly discovered rolling movement shown in (A) a three-dimensional cryo-electron microscopy image of the ribosome and (B) a computer-generated atomic-resolution model of the human ribosome consistent with the microscopy. In (A), arrows indicate the direction of movement during the transition between the two states. In (B), ribbons represent the backbone of RNA and protein molecules within the ribosome; the color bar indicates the amount of motion during rolling.
LANL 7/3/2014—A groundbreaking study of the human ribosome is revealing that the tiny molecular machine is more versatile than previously understood. Ribosomes (found in all living cells) create proteins, making the ribosome one of life’s most fundamental machines. This research shows that the ribosome is highly programmable: minor changes in its sequencing can change its operation, allowing it to adapt to a changing environment, as described in a paper published July 3 in Cell. Specifically, subtle changes in the human ribosome’s overall structure alter its inner workings, shifting from a molecular mechanism based on a ‘rocking’ motion to one based on a ‘rolling’ motion. “Cracking the mechanism of human ribosomes will have applications to a variety of diseases, so we are now seeing the real payoff of over a decade of computer simulations of the ribosome,” said researcher Karissa Sanbonmatsu of Los Alamos National Laboratory (LANL).

View LANL Data Science Press Release

July 2, 2014 — ‘Deep learning’ makes search for exotic particles easier

UCI 7/2/2014—Fully automated “deep learning” by computers greatly improves the odds of discovering particles such as the Higgs boson, beating even veteran physicists’ abilities, according to findings by UC Irvine researchers published today in the journal Nature Communications. Machine learning is a branch of computer science in which, rather than being programmed to do a difficult task, computers learn automatically from examples. Currently, physicists devise by hand mathematical formulas that they apply to the data to derive the features they’re looking for, which are then fed to machine learning programs. However, by employing recent advances in deep learning, in which computers learn automatically at multiple processing levels, the UCI team eliminated the need for the time-consuming manual creation of those formulas in the search for these fleeting particles. In computer experiments using carefully structured simulated data, the UCI researchers’ methods resulted in a statistically significant 8 percent increase in the detection of these particles. Fully automated deep learning techniques could be employed in experiments scheduled for 2015 at the Large Hadron Collider.
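To make the contrast concrete, here is a minimal, hypothetical sketch (invented data and architecture, not the UCI team’s code) comparing a classifier given one hand-crafted feature with a small neural network trained directly on the raw simulated inputs:

```python
# Illustrative sketch only: invented data, not the UCI analysis.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n = 20_000
raw = rng.normal(size=(n, 4))  # stand-ins for low-level kinematic variables
label = (raw[:, 0] * raw[:, 1] + np.sin(raw[:, 2]) > 0).astype(int)  # hidden rule

# A hand-crafted feature that captures only part of the hidden rule.
handmade = (raw[:, 0] * raw[:, 1]).reshape(-1, 1)

raw_tr, raw_te, hand_tr, hand_te, y_tr, y_te = train_test_split(
    raw, handmade, label, random_state=0)

shallow = LogisticRegression().fit(hand_tr, y_tr)       # physicist-designed feature
deep = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=300,
                     random_state=0).fit(raw_tr, y_tr)  # learns its own features

print("hand-crafted feature AUC:", roc_auc_score(y_te, shallow.predict_proba(hand_te)[:, 1]))
print("deep net on raw inputs AUC:", roc_auc_score(y_te, deep.predict_proba(raw_te)[:, 1]))
```

On data like these, the network typically recovers the hidden nonlinear rule that the single engineered feature only partially captures.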

View UCI Data Science Press Release

July 2, 2014 — SDSC assists researchers in novel wildlife tracking project

3D tracking of endangered wildlife
An image from a sequence of visualizations from analysis to identify home range in 3D of California condors. The blue and orange colored surfaces show probabilities of male and female birds, respectively. Credit: San Diego Supercomputer Center, San Diego Zoo, and U.S. Geological Survey
SDSC 7/2/2014—A team including researchers from the U.S. Geological Survey (USGS) and the San Diego Zoo’s Institute for Conservation Research has developed a novel methodology to monitor endangered wildlife. It is the first to combine 3D and advanced range estimator technologies to provide highly detailed data on the range and movements of terrestrial, aquatic, and avian wildlife species. Relying on high-performance programming and computerized visualization expertise from researchers at the San Diego Supercomputer Center (SDSC) at UC San Diego, the team created highly detailed data sets and visualizations after tracking three iconic but threatened species: California condors, giant pandas, and dugongs (a large marine animal somewhat similar to the manatee). Since 3D modeling is much more computationally intensive than 2D, what started as a supercomputing challenge evolved into an exercise in optimizing codes so that current problems of interest can be handled on laptops or even smartphones. SDSC’s Gordon supercomputer is being used to create interactive visualizations and will make it easier for the software developers to explore the impact of algorithmic modifications on the quality of the solution.
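The 3D home-range idea can be illustrated with a toy kernel-density estimate over invented telemetry fixes; this is a sketch of the general technique, not the USGS/SDSC pipeline:

```python
# Toy 3D utilization-density estimate from invented telemetry fixes.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
# Pretend GPS fixes: x and y in metres, z is altitude in metres.
fixes = rng.normal(loc=[0.0, 0.0, 500.0], scale=[2000.0, 2000.0, 150.0], size=(500, 3))

kde = gaussian_kde(fixes.T)  # Gaussian kernel density in three dimensions
grid = np.mgrid[-5000:5000:30j, -5000:5000:30j, 0:1000:20j]
density = kde(grid.reshape(3, -1)).reshape(grid.shape[1:])

# Crude stand-in for a high-use "core area": the top 5% of voxels by density.
level = np.quantile(density, 0.95)
print("voxels in core area:", int((density >= level).sum()))
```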

View SDSC Data Science Press Release

July 1, 2014 — Nine junior faculty receive national recognition

UCR computational scientists win NSF CAREER grants
The CAREER grant is one of the most prestigious awards given out by the National Science Foundation.
UCR 7/1/2014—Nine researchers at UC Riverside were awarded National Science Foundation (NSF) CAREER grants between July 2013 and June 2014. One of NSF’s most prestigious awards, the CAREER grant is targeted toward promising new faculty early in their careers, with the goal of providing stable support while they establish themselves as independent researchers and exceptional educators in their fields. Two of the researchers work in computational science. Chia-en Chang, an assistant professor of chemistry and bioinformatics, will use computer modeling to investigate complex interactions among molecules, including proteins, enzymes and nanoparticles. Juhi Jang, an associate professor of mathematics, will investigate physically important phenomena, such as the collapse of stars or the generation of vortices at the interface between two fluids, using partial differential equation (PDE) approaches.

View UCR Data Science Press Release

June 30, 2014 — Up in flames: Evidence confirms combustion theory

Up in flames: combustion theory confirmed
Graphical representation of the chemistry in the early stages of soot formation. The mechanism to the right was demonstrated by experiment, while the one on the left was not. Credit: Dorian Parker, University of Hawaii
LBNL 6/30/2014—Researchers at the Department of Energy’s Lawrence Berkeley National Lab (LBNL) and the University of Hawaii have uncovered the first step in the process that transforms gas-phase molecules into solid particles like soot and other carbon-based compounds. For more than 30 years, scientists have developed computational models of combustion to explain how gas molecules form soot, but now Musahid Ahmed, scientist in LBNL’s Chemical Sciences Division, and his colleagues have data to confirm one long-standing theory. The finding could help combustion chemists make more-efficient, less-polluting fuels, and help materials scientists fine-tune their carbon nanotubes and graphene sheets for faster, smaller electronics. In addition, the results could have implications for the burgeoning field of astrochemistry, potentially establishing the chemical process for how gaseous outflows from stars turn into carbon-based matter in space.

View LBNL Data Science Press Release

June 30, 2014 — ‘Thirsty’ metals key to longer battery lifetimes

‘Thirsty’ metals lengthen battery lifetimes
LBNL/NERSC 6/30/2014—Today’s batteries simply do not hold enough charge. Replacing lithium with other metals with multiple charges could greatly increase battery capacity. But first researchers need to understand how to keep multiply-charged ions—ions that have gained or lost more than one electron—stable. Research has shown that when a multiply-charged aluminum or magnesium cation—a positively charged ion—encounters a single water molecule, the result can be explosive. The metal ion rips an electron from the water molecule, causing a molecular-level explosion, triggered by the Coulombic repulsion of the two positive charges on each fragment. But multiply-charged metal ions exist in water in countless ways, such as the calcium ions in a chocolate milkshake. Now, using supercomputers at the National Energy Research Scientific Computing Center (NERSC), a team of Pacific Northwest National Laboratory (PNNL) researchers determined the paths that lead to either the hydrolysis of water or the creation of stable metal ion clusters peaceably surrounded by water.

View LBNL,NERSC Data Science Press Release

June 25, 2014 — UC San Diego to launch new master’s program in data science and engineering

New data science/engineering master’s degree at UCSD
Computer science and engineering SDSC program to start in fall 2014
SDSC 6/25/14—UC San Diego has announced a new master’s degree program in Data Science and Engineering, intended for working professionals with a broad educational background and/or training in computer science, engineering, or mathematics. The Data Science and Engineering master’s degree program is taught by world-renowned professors and researchers from the Department of Computer Science and Engineering (CSE) in the UC San Diego Jacobs School of Engineering, in collaboration with the university’s San Diego Supercomputer Center (SDSC). The two-year, part-time program is completed in person on the UC San Diego campus and leads to a Master of Advanced Study (MAS) degree. “The data scientist is a new kind of professional, one who combines the skills of a software programmer, database manager, statistician, and communicator capable of creating mathematics-based models of data, identifying trends and deviations, and presenting them in compelling, visual ways,” said Rajesh Gupta, professor and chair in the Department of Computer Science and Engineering. “The very nature and process of scientific research is changing in the era of ‘big data’, and this innovative master’s program is our response to the growing demand for data scientists able to meet this new challenge.”

View SDSC Data Science Press Release

June 24, 2014 — Researcher calls report on economic impacts of U.S. climate change ‘like a flashlight at night’

Report on US climate change ‘flashlight at night’
The graphic above indicates the estimated annual economic damages due to climate change in six key sectors of the United States economy.
UCB 6/24/14—UC Berkeley economist and assistant professor of public policy Solomon Hsiang led the econometrics team that helped assemble a major report, American Climate Prospectus: Economic Risks in the United States, released June 24, which projects significant economic risks from climate change in the United States. The first data-driven national study to provide local estimates of economic risks to key economic sectors, it found that the hardest-hit areas will include heavily used and increasingly expensive energy sources; labor productivity will be hard hit as workers are sapped by the heat; infrastructure damage from rising sea levels will surge when combined with hurricanes; and, under business-as-usual climate action, there is a roughly 50 percent chance that the nation’s climate-related mortality rate will rise to 1 to 3 times the motor vehicle mortality rate. While some of the findings may sound sensational, the report itself is careful to present a balanced picture, said Hsiang, who called it “a hard-nosed risk analysis, produced as if the U.S. were run like a firm.”

View UCB Data Science Press Release

June 24, 2014 — NASA launches earth science challenges with OpenNEX cloud data

Citizen scientists challenged to use NASA big data
NASA satellite data incorporated into OpenNEX include global views of drought conditions. Green regions in this map of July 2012 are areas with more vegetation than an average July (2000–2013); red regions have less vegetation than average. Regions in black have no data due to clouds and snow. Credit: NASA Earth Exchange (NEX)
NASA Ames 6/24/14—NASA is launching two challenges to give the public an opportunity to create innovative ways to use data from the agency’s Earth science satellites. The challenges will use the Open NASA Earth Exchange (OpenNEX)—a data, supercomputing, and knowledge platform where users can share modeling and analysis codes, scientific results, and expertise—to solve big data challenges in the Earth sciences. A component of the NASA Earth Exchange, OpenNEX provides users a large collection of climate and Earth science satellite data sets, including global land surface images, vegetation conditions, climate observations and climate projections. The two challenges will allow citizen scientists to realize the value of NASA data assets and to offer NASA new ideas on how to share and use that data. The first “ideation” stage of the challenge, which runs July 1 through August 1, offers up to $10,000 in awards for ideas on novel uses of the datasets. The second “builder” stage, beginning in August, will offer between $30,000 and $50,000 in awards for the development of an application or algorithm that promotes climate resilience using the OpenNEX data, based on ideas from the first stage of the challenge. NASA will announce the overall challenge winners in December.

View AMES Data Science Press Release

June 23, 2014 — Big-data project nets professor a big award

DOE award to UCM for scientific big data analysis
UCM 6/23/14—Professor Florin Rusu’s passion for analyzing voluminous amounts of data has won him a prestigious early-career award from the federal Department of Energy (DOE). Rusu, with UC Merced’s School of Engineering, will receive $750,000 over the next five years for his project “Scalable and Energy-Efficient Methods for Interactive Exploration of Scientific Data,” in which he will research novel methods and algorithms to be integrated in a big-data system for scientific analysis. “The Department of Energy has many very interesting projects and generates massive amounts of data,” Rusu said, including astronomical observations. “They have observatories taking high-resolution pictures of the entire sky. Every pixel contains many megabytes of data, but how do you extract the useful information out of it?” Rusu’s project is designed to devise tools to make possible the inspection of as many hypotheses as scientists can come up with.

View UCM Data Science Press Release

June 19, 2014 — Lawrence Livermore, MIT researchers develop new ultralight, ultrastiff 3D printed materials

New ultralight, ultrastiff 3D printed materials
Lawrence Livermore Engineer Xiaoyu “Rayne” Zheng—lead author of the Science article—studies a macroscale version of the unit cell, which constitutes the ultralight, ultrastiff material. Photos by Julie Russell/LLNL.
LLNL 6/19/14—Imagine a material with the same weight and density as aerogel—a material so light it is sometimes called ‘frozen smoke’—but with 10,000 times more stiffness. Such a material could have a profound impact on the aerospace or automotive industries and other applications where lightweight, high-stiffness and high-strength materials are needed. Lawrence Livermore National Laboratory (LLNL) and Massachusetts Institute of Technology (MIT) researchers have developed a material with these properties using additive micro-manufacturing processes, designed with the aid of supercomputer modeling. The research team’s findings are published in the June 20 issue of Science. Titled “Ultralight, Ultrastiff Mechanical Metamaterials,” the article describes the team’s development of micro-architected metamaterials—artificial materials with properties not found in nature that maintain a nearly constant stiffness per unit mass density, even at ultralow density.

View LLNL Data Science Press Release

June 18, 2014 — A cure for Alzheimer’s requires a parallel team effort

Team simulates megafund to fight Alzheimer’s
Kenneth S. Kosik. Credit: Spencer Bruttig
UCSB 6/18/14—For the more than 5 million Americans and 35 million people worldwide suffering from Alzheimer’s disease, the rate of progress in developing effective therapeutics has been unacceptably slow. To address this urgent need, UC Santa Barbara’s Kenneth S. Kosik and colleagues are calling for a parallel drug development effort in which multiple mechanisms for treating Alzheimer’s are investigated simultaneously rather than the current one-at-a-time approach. In an article published today in Science Translational Medicine, the research team—Kosik, Andrew W. Lo and Jayna Cummings of the MIT Sloan School of Management’s Laboratory for Financial Engineering, and Carole Ho of Genentech, Inc.—presents a simulation of a hypothetical megafund devoted to bringing Alzheimer’s disease therapeutics to fruition. To quantify potential cost savings, the authors used projections developed by the Alzheimer’s Association and found that savings could range from $813 billion to $1.5 trillion over a 30-year period, more than offsetting the cost of a $38 billion megafund. Given that Alzheimer’s disease has the potential to bankrupt medical systems—the Alzheimer’s Association projects that the costs of care could soar to $1 trillion in the U.S. by 2050—governments around the world have a strong incentive to invest more heavily in the development of Alzheimer’s disease therapeutics and catalyze greater private-sector participation.

View UCSB Data Science Press Release

June 17, 2014 — High-performance computer system installed at Los Alamos National Laboratory

Wolf to model climate, materials, astrophysics
The Wolf computer system modernizes mid-tier resources for Los Alamos scientists.
LANL 6/17/14—Los Alamos National Laboratory (LANL) recently installed a new high-performance computer system, called Wolf, to be used for unclassified research. Wolf, manufactured by Cray Inc., has 616 compute nodes, each with two 8-core 2.6 GHz Intel “Sandy Bridge” processors, 64 GB of memory, and a high-speed InfiniBand interconnect network. It utilizes the Laboratory’s existing Panasas parallel file system as well as a new one based on Lustre technology. The Wolf computing system operates at 197 teraflops. Collectively, the system has 9,856 compute cores and 19.7 terabytes of memory. It provides users with 86.3 million central processing unit core hours per year. Initial science research projects to utilize Wolf will include climate, materials, and astrophysics modeling.
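As a cross-check, the aggregate figures follow from the node configuration, assuming the usual eight double-precision floating-point operations per cycle for processors of this generation:

\[
616 \times 2 \times 8 = 9{,}856 \ \text{cores}, \qquad
9{,}856 \times 2.6\ \text{GHz} \times 8\ \tfrac{\text{FLOPs}}{\text{cycle}} \approx 205\ \text{teraflops peak},
\]
\[
9{,}856\ \text{cores} \times 8{,}760\ \tfrac{\text{hours}}{\text{year}} \approx 86.3\ \text{million core-hours per year},
\]

with the quoted 197-teraflop figure sitting just below the theoretical peak.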

View LANL Data Science Press Release

June 16, 2014 — The games genes play: Algorithm helps explain sex in evolution

Algorithm explains sex; genes at play
Scientists have identified an algorithm that governs the game genes play during sex. The algorithm helps explain how evolution has yielded the large diversity of life we see today. Credit: Dollarphotoclub
UCB 6/16/14—What do you get when you mix theorists in computer science with evolutionary biologists? You get an algorithm to explain sex. It turns out that 155 years after Charles Darwin first published On the Origin of Species, vexing questions remain about key aspects of evolution, such as how sexual recombination and natural selection produced the teeming diversity of life that exists today. The answer could lie in the game that genes play during sexual recombination, and computer theorists at UC Berkeley have identified an algorithm to describe the strategy used by these genes in this game. Their proposal, published June 16 in the online Early Edition of the Proceedings of the National Academy of Sciences, addresses the dueling evolutionary forces of survival of the fittest and of diversity.

View UCB Data Science Press Release

June 16, 2014 — Researchers develop efficient approach to manufacture 3D metal parts

Simulation-controlled 3D manufacturing
Direct metal laser melting (DMLM) machine in action: A laser fuses metal powder to form one of many successive layers that will form the final manufactured part.
LLNL 6/16/14—Selective laser melting (SLM) is a powder-based, additive manufacturing process in which a three-dimensional (3D) mechanical part is produced, layer by layer, using a high-energy laser beam to fuse particles of metal powder. Some devices or mechanisms require parts that are very dense, with less than 1 percent porosity, as the pores or voids are the weakest part of the material and the most likely cause of failure. A challenging problem in additive manufacturing has been selecting the appropriate process parameters to build up mechanical parts with desired properties. Lawrence Livermore National Laboratory (LLNL) researchers have now developed a new and more efficient approach to identify optimal parameters for printing 3D high-density metal parts. Their paper, recently published in the International Journal of Advanced Manufacturing Technology, explains how parameters for higher-power SLM machines can be selected by using simple computational simulations to explore the process parameter space. The simulations compute the dimensions of the pool of liquid formed when the laser melts the metal powder particles.
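A toy version of that workflow might look like the following: sweep the parameter space with a cheap model and keep the combinations predicted to melt deeply enough. The scaling law and numbers here are placeholders, not LLNL’s simulation:

```python
# Toy SLM parameter sweep; the depth "model" is a placeholder scaling law.
import numpy as np

powers = np.linspace(100, 400, 7)   # laser power, W (hypothetical range)
speeds = np.linspace(0.5, 2.5, 9)   # scan speed, m/s (hypothetical range)
layer_thickness = 30e-6             # powder layer, m

def melt_pool_depth(power, speed, k=2.4e-7):
    """Placeholder: depth grows with power and shrinks with speed.
    A real study computes this from a physics-based melt-pool simulation."""
    return k * power / np.sqrt(speed)

# Rule of thumb: remelt well past one layer so successive layers fuse fully.
for power in powers:
    for speed in speeds:
        depth = melt_pool_depth(power, speed)
        if depth >= 1.5 * layer_thickness:
            print(f"keep P={power:3.0f} W, v={speed:3.1f} m/s "
                  f"(predicted depth {depth * 1e6:5.1f} um)")
```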

View LLNL Data Science Press Release

June 16, 2014 — UCLA Engineering to lead NSF project to improve timekeeping for ‘Internet of Things’

Car 54, where are you?
Credit: HauChee Chung
UCLA 6/16/14—Timekeeping presents a particular challenge in the emerging field of cyber-physical systems (CPS)—often called the “Internet of Things”—in which objects and devices are equipped with embedded software and are able to communicate with and be controlled by wireless digital networks. Such cyber-physical devices depend on precise knowledge of time in order to infer location, control communications, and accurately coordinate activities in a broad and growing range of applications: autonomous cars and aircraft autopilot systems, advanced robotic and medical devices, energy-efficient buildings, and an array of other industrial initiatives. The National Science Foundation (NSF) has announced a $4 million “Frontier” award to a team based at UC Los Angeles that will tackle the challenge of timekeeping in cyber-physical systems. The research team’s Roseline project, headquartered at the UCLA Henry Samueli School of Engineering and Applied Science, will work to improve the accuracy, efficiency, robustness and security with which computers maintain their knowledge of physical time and synchronize it with such networked devices.

View UCLA Data Science Press Release

June 11, 2014 — Lawrence Livermore Lab awarded $5.6 million to develop next generation neural devices

Implantable brain stimulator for mental disorders
This rendering shows the next generation neural device capable of recording and stimulating the human central nervous system being developed at Lawrence Livermore National Laboratory. The implantable neural interface will record from and stimulate neurons within the brain for treating neuropsychiatric disorders.
LLNL 6/11/14—Lawrence Livermore National Laboratory (LLNL) recently received $5.6 million from the Department of Defense’s Defense Advanced Research Projects Agency (DARPA) to develop an implantable neural interface with the ability to record and stimulate neurons within the brain for treating neuropsychiatric disorders. The technology will help doctors to better understand and treat post-traumatic stress disorder, traumatic brain injury, chronic pain, and other conditions. LLNL and Medtronic are collaborating with UC San Francisco, UC Berkeley, Cornell University, New York University, PositScience Inc., and Cortera Neurotechnologies on the DARPA SUBNETS project. Some collaborators will develop the electronic components of the device, while others will validate and characterize it.

View LLNL Data Science Press Release

June 9, 2014 — Using Twitter to track flu, Lady Gaga

Healthsocialanalytics visualizes medical big data
Screenshot of healthsocialytics.com web site developed by Vagelis Hristidis and his team at UC Riverside.
UCR 6/9/14—Interested in the number of tweets about the flu in recent days, weeks or months? Whether the tweets are positive or negative? How they are dispersed geographically, down to the street level? Words commonly used with flu? Or even predicting the number of tweets about the flu in the coming days? Health Social Analytics (healthsocialytics.com)—a web site developed by Vagelis Hristidis, UC Riverside associate professor of computer science and engineering—visualizes data about health-related disorders, drugs and organizations from Twitter, news stories, and online health forums such as WebMD. “This is a tool that brings the power of social and news big data to your fingertips,” Hristidis said. The tool has potential applications for a wide range of groups. Government agencies could use it when preparing for a public health emergency, such as the H1N1 (swine) flu scare in 2009–2010. Drug companies could use it to check the sentiment and volume of online chatter related to their drugs versus those of their competitors. Health psychologists could use it to learn which keywords dominate health-related forums or which disorders have the biggest online communities. Media outlets could use it to track health trends.

View UCR Data Science Press Release

June 9, 2014 — ‘Erratic’ lasers pave way for tabletop accelerators

Tabletop particle accelerators?
3D Map of the longitudinal wakefield generated by the incoherent combination of 208 low-energy laser beamlets. In the region behind the driver, the wakefield is regular. Credit: Carlo Benedetti, Berkeley Lab
LBNL/NERSC 6/9/14—Making a tabletop particle accelerator just got easier. A new study shows that certain requirements for the lasers used in an emerging type of compact particle accelerator can be significantly relaxed. Researchers hope the finding could bring about a new era of accelerators that would need just a few meters to bring particles to great speeds, rather than the many kilometers required of traditional accelerators. Conventional wisdom held that the light from many smaller lasers, combined to create one ultrapowerful pulse, would need to be precisely matched in color, phase, and other properties in order to produce the electron-accelerating motion within the plasma. But the new Lawrence Berkeley National Laboratory (LBNL) study—guided by theory and using computer simulations at the National Energy Research Scientific Computing Center (NERSC) to test various scenarios—revealed that erratic laser light could also work. The research was presented in the May special issue of Physics of Plasmas. View related LBNL press release

View NERSC Data Science Press Release

June 7, 2014 — Farming: A climate change culprit

Climate change culprit: Farming
The Sahel region is a narrow swath of semi-arid land that spans the African continent, from the Atlantic Ocean in the west to the Red Sea in the east. The low annual precipitation indicates the region is strongly reliant on the monsoon season for water supply.
LBNL/NERSC 6/7/14—Increased agricultural activity is a rain taker, not a rain maker, according to computer simulations of African monsoon precipitation run by researchers from Pacific Northwest National Laboratory, UC Los Angeles and the University of Texas. Using supercomputers at the Department of Energy’s National Energy Research Scientific Computing Center (NERSC) and Oak Ridge Leadership Computing Facility, the research team fed observed land-use change data into a climate model and found the expansion of agriculture in the African Sahel region decreases summer rainfall through its impact on monsoon rains. The simulated decrease in summer rainfall reaches 10 percent over the Sahel, a region that is already stressed by water needs for human and ecological use. The study findings, which offer new insights into how land-use change may affect regional rainfall, were published in Climate Dynamics.

View LBNL, NERSC Data Science Press Release

June 4, 2014 — UCLA researchers create nanoscale structure for computer chips that could yield higher-performance memory

New nanoscale structure for computer chips
Illustration of a new structure developed by UCLA researchers for more energy-efficient computer chips. The arrows indicate the effective magnetic field due to the structure's asymmetry. Credit: UCLA Engineering
UCLA 6/4/14—Researchers at UC Los Angeles have created a nanoscale magnetic component for computer memory chips that could significantly improve their energy efficiency and scalability. The design brings a new and highly sought-after type of magnetic memory one step closer to being used in computers, mobile electronics such as smart phones and tablets, as well as large computing systems for big data. The innovative asymmetric structure allows it to better exploit electrons’ spin and orbital properties, making it much more power efficient than today’s computer memory.

View UCLA Data Science Press Release

June 2, 2014 — Case study: NAS tape-based solution to manage and store high-volume data

Keeping big data accessible forever
NASA Ames 6/2/14—This case study on the mass storage and archiving systems at the NASA Advanced Supercomputing (NAS) facility at NASA Ames Research Center looks specifically at NAS’s automated method of archiving data from disk to tape, which saves energy and space while still making data easily accessible to users—forever. The case study comes from the Active Archive Alliance, a collaborative industry organization that explores new technologies for enabling reliable, online, and efficient access to archived data.

View AMES Data Science Press Release

May 30, 2014 — Desert scientists turn to rainforest for climate answers

Filling in Amazon data for climate models
Los Alamos scientist Heath Powers, foreground, and on-site technician Vagner Castro work on field equipment for measuring carbon dioxide and water vapor near areas of human habitation in Brazil.
LANL 5/30/14—A team of scientists deployed to Brazil’s Amazon Basin is unraveling the mysteries of how land and atmospheric processes affect tropical hydrology and climate. Their work will go far toward improving the climate-prediction computer models on which scientists and policymakers rely for future climate-related planning. Their job is to go into climatically undersampled regions, where data are scarce, and collect measurements aimed at fine-tuning climate modeling, both to improve model design and to verify conditions at specific locations. The experiment, Green Ocean Amazon (GOAmazon), has been underway since January 2014; nearly 100 collaborators from the US, Germany, and Brazil will be studying the rainforest through December 2015. The scientists and their technical assistants have been placing instruments at various sites in Brazil’s Amazon Basin, examining solar radiative energy and atmospheric moisture profiles, and measuring clouds’ and aerosols’ microphysical and chemical properties.

View LANL Data Science Press Release

May 26, 2014 — High-performance computing at Los Alamos announces milestone for key/value middleware

Milestone for massive computing in fine detail
Billion inserts-per-second data milestone reached for supercomputing tool.
LANL 5/26/14—A data middleware project has achieved a milestone for specialized information organization and storage. The Multi-Dimensional Hashed Indexed Middleware (MDHIM) project at Los Alamos National Laboratory (LANL) recently achieved 1,782,105,749 key/value inserts per second into a globally ordered key space on LANL’s Moonlight supercomputer. In today’s highly parallel computing world, the need for scalability has forced a shift away from fully transactional databases and back to the loosened semantics of key/value stores. Computer simulations overall are scaling to higher parallel-processor counts, simulating finer physical scales or more complex physical interactions. As they do so, the simulations produce ever-larger data sets that must be analyzed to yield the insights scientists need. Traditionally, much data analysis has been visual; data are turned into images or movies. Statistical analysis generally occurs over the entire data set. But more detailed analysis on entire data sets is becoming untenable due to the resources required to move, search, and analyze all the data at once. The ability to identify, retrieve, and analyze smaller subsets of data within the multidimensional whole would make detailed analysis much more practical. To achieve this, however, it is essential to find strategies for managing these multiple dimensions of simulation data. The MDHIM project aims to create a middle-ground framework between fully relational databases and distributed but completely local constructs like “map/reduce.”
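The core idea, a globally ordered key space range-partitioned across many servers so that ordered sub-range queries touch only the servers owning that range, can be sketched as follows (a toy illustration, not the MDHIM API):

```python
# Toy range-partitioned, globally ordered key/value store (not MDHIM).
import bisect

class OrderedKV:
    def __init__(self, num_servers=4, key_space=2**32):
        # Split the key space into contiguous ranges, one per "server".
        self.bounds = [i * key_space // num_servers for i in range(1, num_servers)]
        self.servers = [dict() for _ in range(num_servers)]  # stand-ins for nodes

    def put(self, key, value):
        self.servers[bisect.bisect_right(self.bounds, key)][key] = value

    def range_scan(self, lo, hi):
        first = bisect.bisect_right(self.bounds, lo)
        last = bisect.bisect_right(self.bounds, hi)
        for server in self.servers[first:last + 1]:  # only the owning servers
            for key in sorted(k for k in server if lo <= k <= hi):
                yield key, server[key]

kv = OrderedKV()
for k in [7, 3_000_000_000, 42, 2**31]:
    kv.put(k, f"value-{k}")
print(list(kv.range_scan(0, 2**31)))  # globally ordered scan of a sub-range
```

Range partitioning is what preserves the global order; a hash partitioner would balance load but scatter neighboring keys across servers.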

View LANL Data Science Press Release

May 27, 2014 — CNEP researchers target brain circuitry to treat intractable mental disorders

UCB/UCSF/LLNL team for ambitious brain research
CNEP researchers plan to target malfunctioning neural circuits to treat intractable mental disorders. Credit: iStockphoto
UCB 5/27/14—Neuroscientists, engineers and physicians are teaming up for an ambitious five-year, $26 million project to develop new techniques for tackling mental illness. By using devices implanted in the brain, they aim to target and correct malfunctioning neural circuits in conditions such as clinical depression, addiction and anxiety disorders. The project is funded by the U.S. government’s Defense Advanced Research Projects Agency (DARPA) as part of its Systems-Based Neurotechnology for Emerging Therapies program. The heart of the project lies at the Center for Neural Engineering and Prostheses (CNEP), a UC Berkeley-UC San Francisco collaboration that kicked off in 2011 with a pioneering vision to use engineering techniques to repair neural circuits that have gone awry. Project members will be working under a collaborative agreement between UCSF and DARPA, and in conjunction with scientists from Lawrence Livermore National Laboratory (LLNL), which is receiving separate funding from DARPA as part of this research. The project opens up the possibility that maladaptive circuits can be permanently changed, essentially curing patients of their psychiatric disorders.

View UCB Data Science Press Release

May 27, 2014 — New venture aims to understand and heal disrupted brain circuitry to treat mental illnesses

Heal brain circuitry, cure mental illness?
UCSF 5/27/14—Scientists and physicians at UC San Francisco are leading a $26 million, multi-institutional research program in which they will employ advanced technology to characterize human brain networks and better understand and treat a range of common, debilitating psychiatric disorders, focusing first on anxiety disorders and major depression. The overall strategy is to first identify brain signaling pathways specifically associated with anxiety and depression, then to develop devices to provide precise stimulation therapies that guide the brain to strengthen alternative circuits. By leveraging the brain’s natural capacity for neural remodeling and learning, this approach will potentially allow the newly strengthened circuits to bypass the disease-associated signals and thereby eliminate symptoms. The project team will address psychiatric disorders in a new way, taking a “systems-level” approach. Instead of focusing on specific cellular processes to target with drugs, the researchers will seek to understand these disorders as disruptions of a complex network.

View UCSF Data Science Press Release

May 22, 2014 — CAFIN Spring Lecture: Finance at light speed, high-frequency trading

Finance at the speed of light
Terrence Hendershott
UCSC 5/22/14—Terrence Hendershott of the Haas School of Business at UC Berkeley spoke on “High Frequency Trading and the 2008 Short Sale Ban” for the Spring Lecture of the UC Santa Cruz Center for Analytical Finance (CAFIN). High-frequency trading and efforts to understand its consequences were recently the subject of the book Flash Boys. Hendershott was the visiting economist at the New York Stock Exchange from 2006 to 2007 and was a member of the Nasdaq Economic Advisory Board from 2004 to 2007, serving as chair in 2007. CAFIN is dedicated to exploring the challenge of improving financial intermediation in an interconnected, volatile world. Within CAFIN, economics professor Eric Aldrich has been working with Professor Gregory Laughlin, chair of the Astronomy and Astrophysics Department, to devise fundamental new techniques for understanding the patterns of financial trading at high speeds. Their core insight is that the pattern of trading can be understood better by using trades as measures of time (“trade time”) rather than conventional “clock time.” This has immediate implications for broad topics such as market volatility, derivatives pricing, and the potential for market failure.
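A minimal sketch of the distinction, using invented data rather than Aldrich and Laughlin’s methodology: sample the same trade stream once every fixed number of trades and once every fixed number of seconds, then compare the resulting bar returns.

```python
# Invented data: compare "trade time" and "clock time" sampling of one tape.
import numpy as np

rng = np.random.default_rng(2)
n_trades = 50_000
# Bursty arrivals: alternating fast and slow spells of 500 trades each.
spell = np.repeat(rng.choice([0.05, 2.0], size=n_trades // 500), 500)
times = np.cumsum(rng.exponential(scale=spell))           # seconds
prices = 100 * np.exp(np.cumsum(rng.normal(0, 1e-4, n_trades)))

trade_bars = prices[::100]                                # every 100 trades
clock_edges = np.arange(0.0, times[-1], 60.0)             # every 60 seconds
clock_bars = prices[np.minimum(np.searchsorted(times, clock_edges), n_trades - 1)]

def excess_kurtosis(bars):
    r = np.diff(np.log(bars))
    return ((r - r.mean()) ** 4).mean() / r.var() ** 2 - 3.0

counts = np.diff(np.searchsorted(times, clock_edges))
print("trades per 60 s bar ranges from", counts.min(), "to", counts.max())
print("excess kurtosis, trade time:", round(excess_kurtosis(trade_bars), 2))
print("excess kurtosis, clock time:", round(excess_kurtosis(clock_bars), 2))
```

Because every trade-time bar aggregates the same number of trades, its returns look close to Gaussian, while clock-time bars mix quiet and bursty spells and show fat tails.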

View UCSC Data Science Press Release

May 21, 2014 — Scientists demonstrate improved catalyst control, energy savings could result

Enzyme-like switching of molecular bonds in catalysts
Synthesis and schematic mechanical model of ligand stabilized open-site clusters. Credit: Nature Nanotechnology 9, 459–465 (2014)
UCB/UCD 5/21/14—Inspired by how enzymes work in nature’s biological processes, researchers have demonstrated a way to improve control of synthetic catalysts. Catalysts accelerate chemical reactions so that they go faster and use less energy. In the research, sponsored by the U.S. Department of Energy, the scientists showed how to switch molecular bonding—the interaction that holds assemblies of atoms together—off and on at will at specific locations within the catalyst. The discovery, researchers said, has potentially profound implications for chemical conversions involving metal catalysts, including pollution abatement. A paper co-authored by Alexander Okrut and Alex Katz of UC Berkeley and Bruce Gates of UC Davis, along with University of Alabama computational chemist David Dixon and his graduate student Shengjie Zhang, was published in a recent online issue of the journal Nature Nanotechnology. View UCB/University of Alabama press release.

May 20, 2014 — UCSC data storage researchers tackle practical problems

Simplifying/speeding how “shingled” disks update
Rekha Pitchumani, a UCSC graduate student in computer science, has been working with Seagate on new data storage technology. Credit: Carolyn Lagattuta
UCSC 5/20/14—Rekha Pitchumani, a computer science graduate student in the Baskin School of Engineering at UC Santa Cruz, has been developing new data storage technologies at UCSC’s Storage Systems Research Center (SSRC) in collaboration with Seagate, a leader in the storage industry. Pitchumani has been working on shingled disk technology, developed to enable computer disk drives to store more information by overlapping the data tracks like roof shingles. To write data on a shingled disk, however, a computer needs new software, and updating large server farms with shingled disks requires a labor-intensive software upgrade for each server. Pitchumani wanted to simplify that process by developing data management software that could be placed directly on a shingled disk. Such “intelligent” disks could then manage themselves, eliminating the software installation step in upgrading server storage.
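The constraint that shingling imposes, and the workaround such on-disk data management software must implement, can be sketched in a few lines (a conceptual toy, not Seagate’s or the SSRC’s design): because rewriting one track in place would clobber the overlapping neighbor, updates are appended to a log and an index tracks where each logical block currently lives.

```python
# Conceptual toy of log-structured writes on a shingled band (not a real design).
class ShingledBand:
    def __init__(self):
        self.log = []    # physical, append-only storage
        self.index = {}  # logical block number -> position in the log

    def write(self, block, data):
        self.index[block] = len(self.log)  # the old copy becomes garbage
        self.log.append(data)

    def read(self, block):
        return self.log[self.index[block]]

    def compact(self):
        """Garbage-collect stale copies by rewriting only the live data."""
        live = sorted(self.index)
        self.log = [self.read(b) for b in live]
        self.index = {b: i for i, b in enumerate(live)}

band = ShingledBand()
band.write(0, "A0"); band.write(1, "B0"); band.write(0, "A1")  # update block 0
print("block 0 reads:", band.read(0))
print("log entries before compaction:", len(band.log))
band.compact()
print("log entries after compaction:", len(band.log))
```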

View UCSC Data Science Press Release

May 19, 2014 — Greenland will be far greater contributor to sea rise than expected

Greenland ice far deeper threat to sea level rise
A glacier in the Sukkertoppen ice cap in southwest Greenland flows down a rocky canyon like those mapped in a new UCI-NASA study. Hundreds of previously unknown coastal canyons buried under the ice could contribute to far higher sea level rise than previously predicted. Credit: Michael Studinger/NASA
UCI 5/19/14—Greenland’s icy reaches are far more vulnerable to warm ocean waters from climate change than had been thought, according to new research by UC Irvine and NASA glaciologists. The work, published today in Nature Geoscience, shows previously uncharted deep valleys stretching for dozens of miles under the Greenland Ice Sheet. To obtain the results, UC Irvine associate project scientist Mathieu Morlighem developed a breakthrough method that for the first time offers a comprehensive view of Greenland’s entire periphery. To reveal the full subterranean landscape, he designed a novel “mass conservation algorithm” that combined the previous ice thickness measurements with information on the velocity and direction of its movement and estimates of snowfall and surface melt. The difference was dramatic. What appeared to be shallow glaciers at the very edges of Greenland are actually long, deep fingers stretching more than 100 kilometers (about 62 miles) inland. “We anticipate that these results will have a profound and transforming impact on computer models of ice sheet evolution in Greenland in a warming climate,” the researchers conclude.
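Schematically, mass conservation links the quantities the algorithm combines (the notation here is ours, not necessarily the paper’s): if \(H\) is the ice thickness, \(\bar{\mathbf{v}}\) the depth-averaged velocity inferred from satellite data, \(\dot{a}\) the snowfall accumulation, and \(\dot{m}\) the surface melt, then

\[
\frac{\partial H}{\partial t} + \nabla \cdot \left( H\,\bar{\mathbf{v}} \right) = \dot{a} - \dot{m},
\]

so wherever the velocity and surface mass balance are known, the equation constrains the thickness field between sparse radar soundings.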

View UCI Data Science Press Release

May 19, 2014 — Wireless camera network offers new possibilities for security systems

Sun-powered “smart cameras” for remote spying
Computer engineering graduate student Kevin Abas designed the SWEETcam prototype.
UCSC 5/19/14—Advances in computer technology are opening up new possibilities for surveillance cameras and environmental video monitoring systems. Kevin Abas, a graduate student in computer engineering at UC Santa Cruz, used off-the-shelf components to build a prototype device for a solar-powered wireless network of smart cameras with potential applications in security systems and wildlife monitoring in remote areas. Abas described his system and reviewed similar designs developed by other researchers in a paper published as a cover feature in the May issue of the journal Computer, published by the IEEE Computer Society. “The on-board components are getting so much smaller and more computationally powerful, it’s possible for a compact camera device to have a lot of data processing and other capabilities,” Abas said.

View UCSC Data Science Press Release

May 19, 2014 — Chemists challenge conventional understanding of how photocatalysis works

Sunlight + water = hydrogen fuel, but cheaply?
From right to left: Francisco Zaera, Yadong Yin and Christopher Bardeen. They are faculty members in the Department of Chemistry at UC Riverside. Credit: I. Pittalwala
UCR 5/19/14—Photocatalysis—catalysis assisted by light—is a promising route to convert solar energy into chemical fuels. Particularly appealing is the possibility of using photocatalysis to split water molecules into molecular hydrogen for use as a fuel to replace fossil fuels. Although photocatalysis has been around for many years, the search for viable photocatalysts to facilitate the splitting of water molecules continues to this day. Photocatalysts are most often semiconductors, with precious metals such as platinum or gold added to promote their activity. However, such “promoter” metals are expensive. There is a need, therefore, to find more economical alternatives. Now a team of chemists at the University of California, Riverside has come up with a model to explain this promoting effect that could shift the focus in the search for substitutes for the metals and help identify better and more economical promoters for photocatalysis in the near future.

View UCR Data Science Press Release

May 15, 2014 — Funding From National Institutes of Health to Help Expand Data Storage Capacity on Campus

UCR receives NIH funding for a Big Data cluster
As computers have improved, growing storage and processing capacities have provided new and powerful ways to gain insight into the world by sifting through vast data sets. Credit: DARPA
UCR 5/15/14—Scientists at UC Riverside work on a variety of research topics critical to human health, such as genome biology, biomedical sciences, chemistry and computational biology. Next-generation sequencing and other high-throughput technologies routinely used in researching these topics generate vast amounts of data, increasing the need for high-performance computing. The campus has now received funding of $600,000 from the National Institutes of Health (NIH) to support data-intensive research — also often called Big Data science. Specifically, the grant will make possible the purchase of a complex instrument: a Big Data cluster with high-performance CPU resources and data storage space equivalent to 5,000 modern laptops.

View UCR Data Science Press Release

May 15, 2014 — UCSF 2.0: Converting data to knowledge, insight and action

Using your cellphone as a heart monitor
Jeffrey Olgin, MD, chief of the Division of Cardiology, displays an application he developed as part of the Health eHeart Study to reduce heart disease; the application monitors vital signs during daily activities.
UCSF 5/15/14—UC San Francisco scientists and clinicians are working on many projects to convert data to knowledge, insight and action, one of the areas of innovation identified in the strategic planning effort known as UCSF 2.0. Cardiologist and electrophysiologist Jeffrey Olgin, MD, along with UCSF colleagues Mark Pletcher and Gregory Marcus, are leading the UCSF-developed Health eHeart study. Funded by the Salesforce.com Foundation and endorsed by the American Heart Association, the study aims to enroll 1 million people from around the world to use their smartphones to send their health data to study doctors so they can better understand how the heart functions and develop new ways to predict and prevent cardiovascular disease.

View UCSF Data Science Press Release

May 14, 2014 — Two Lawrence Livermore researchers awarded early career funding

Gamblin on ways to scale up supercomputer codes
Lawrence Livermore computer scientist Todd Gamblin will receive up to $2.5 million in funding to accelerate the adaptation of scientific simulation codes to increasingly powerful supercomputers.
LLNL 5/14/14—Todd Gamblin, a computer scientist in Lawrence Livermore National Laboratory’s (LLNL) Center for Applied Scientific Computing, will receive up to $2.5 million in funding over five years for a project to accelerate the adaptation of scientific simulation codes to increasingly powerful supercomputers. Adapting a complex application is a process that currently can take up to six months, and increasingly complex machine architectures and applications are making it even slower. Under a project entitled “Statistical Methods for Exascale Performance Modeling,” Gamblin proposes to develop statistical models of applications that can represent adaptive, data-dependent code behavior in a manner that can be scaled up for more powerful computing systems. In addition, the project will develop techniques to reduce the complexity of application models so that application developers can understand them.
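The flavor of the task, fitting a statistical model to measured runtimes and extrapolating to larger machines, can be sketched with an ordinary least-squares fit; the benchmark numbers below are invented, and Gamblin’s proposed models target far richer, data-dependent behavior:

```python
# Toy performance model: fit runtimes, then extrapolate to a larger machine.
import numpy as np

cores = np.array([64.0, 128.0, 256.0, 512.0, 1024.0])
runtime = np.array([410.0, 221.0, 128.0, 83.0, 61.0])  # invented timings, seconds

# Model: t(p) = a/p + b*log2(p) + c  (compute + communication + fixed overhead)
A = np.column_stack([1.0 / cores, np.log2(cores), np.ones_like(cores)])
coef, *_ = np.linalg.lstsq(A, runtime, rcond=None)

p = 4096.0
pred = coef @ np.array([1.0 / p, np.log2(p), 1.0])
print("fitted (a, b, c):", np.round(coef, 2))
print("predicted runtime at 4096 cores:", round(pred, 1), "seconds")
```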

View LLNL Data Science Press Release

May 14, 2014 — Protein Data Bank archives its 100,000th molecule structure

Protein Data Bank achieves major milestone
The number of structures available in the Protein Data Bank per year, as of May 14, 2014. Credit: wwPDB
SDSC 5/14/14—The Protein Data Bank (PDB) is the single worldwide repository for the three-dimensional structures of large biological molecules, such as proteins and nucleic acids, that are vital to pharmacology and bioinformatics research. Recently, PDB archived its 100,000th molecule structure, doubling its size in just six years. Four data centers, including the San Diego Supercomputer Center (SDSC)/Skaggs School of Pharmacy and Pharmaceutical Sciences at UC San Diego, support online access to the three-dimensional structures of biological macromolecules. Those structures help researchers understand many facets of biomedicine, agriculture, and ecology, from protein synthesis to health and disease to biological energy.

View SDSC Data Science Press Release

May 14, 2014 — Central Valley groundwater depletion raises Sierra and may trigger small earthquakes

Lack of groundwater may trigger earthquakes
Central Valley groundwater depletion raises Sierra and may trigger small earthquakes. GPS measurements show that the Sierra Nevada and Coast Ranges rise several millimeters per year (red dots) as a result of groundwater pumping in the Central Valley (brown). Blue dots are sites where the ground has subsided.
UCB 5/14/14—Winter rains and summer groundwater pumping in California’s Central Valley make the Sierra Nevada and Coast Ranges sink and rise by a few millimeters each year, creating stress on the state’s earthquake faults that could increase the risk of a quake. Millimeter-precision measurements of elevation with improved continuous GPS networks and satellite-based interferometric synthetic aperture radar (InSAR) revealed a steady rise of the Sierra of 1–2 millimeters per year. In response to the current drought, about 30 cubic kilometers (7.2 cubic miles) of water were removed from Central Valley aquifers between 2003 and 2010, causing a rise of about 10 millimeters (2/5 inch) in the Sierra over that time. Similar levels of periodic stress, such as that caused by the motions of the moon and sun, increase the number of microquakes on the San Andreas Fault, which runs parallel to the mountain ranges. If these subtle seasonal load changes are capable of influencing the occurrence of microquakes, it is possible that they can sometimes also trigger a larger event.

View UCB Data Science Press Release

May 14, 2014 — The state of rain

The state of rain: drought early warning
Spring–summer rainfall trends for the U.S.
UCSB 5/14/14—A new dataset developed in partnership between UC Santa Barbara and the U.S. Geological Survey (USGS) can be used for environmental monitoring and drought early warning. The Climate Hazards Group Infrared Precipitation with Stations (CHIRPS) dataset, a collaboration between UCSB’s Climate Hazards Group and USGS’s Earth Resources Observation and Science (EROS) Center, couples rainfall data observed from space with more than three decades of rainfall data collected at ground stations worldwide. The new dataset allows experts who specialize in the early warning of drought and famine to monitor rainfall in near real-time, at high resolution, over most of the globe. CHIRPS data can be incorporated into climate models, along with other meteorological and environmental data, to project future agricultural and vegetation conditions.
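A toy version of the blending idea, adjusting a gridded satellite estimate toward nearby ground gauges by spreading station-minus-satellite anomalies with inverse-distance weights, looks like this (invented numbers; CHIRPS itself uses a far more careful scheme):

```python
# Toy station-plus-satellite rainfall blend (not the CHIRPS algorithm).
import numpy as np

rng = np.random.default_rng(3)
ny, nx = 50, 50
satellite = 5 + 2 * rng.random((ny, nx))   # mm/day, invented gridded field

# A few gauges: (row, col, observed rainfall in mm/day).
stations = [(10, 12, 9.0), (40, 8, 3.5), (25, 44, 6.2)]

yy, xx = np.mgrid[0:ny, 0:nx]
num = np.zeros((ny, nx))
den = np.zeros((ny, nx))
for r, c, obs in stations:
    w = 1.0 / ((yy - r) ** 2 + (xx - c) ** 2 + 1.0)  # inverse-distance-squared
    num += w * (obs - satellite[r, c])               # station-minus-satellite anomaly
    den += w
blended = satellite + num / den                      # spread anomalies over the grid

print("satellite at gauge 1:", round(float(satellite[10, 12]), 2),
      "-> blended:", round(float(blended[10, 12]), 2))
```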

View UCSB Data Science Press Release

May 13, 2014 — Scientists reveal structural secrets of enzyme used to make popular anti-cholesterol drug

Why mutant enzyme makes cholesterol drug faster
Representation of the chemical structure of the mutated enzyme LovD9 Credit: UCLA/Codexis
UCLA 5/13/14—In pharmaceutical production, identifying enzyme catalysts that help improve the speed and efficiency of the process can be a major boon. Figuring out exactly why is an altogether different quest. In 2011, UCLA scientists and colleagues discovered that a mutated enzyme could help produce the cholesterol-lowering drug simvastatin (Zocor) far more efficiently than the natural, non-mutated version of the enzyme. But no one quite knew why, until another team of UCLA researchers cracked the mystery. Using a combination of experimental measurements and extensive computer simulations, the multidisciplinary team of researchers uncovered important structural features hidden in the modified enzyme that helped them unlock the secret of its efficacy.

View UCLA Data Science Press Release

May 12, 2014 — Fighting off virtual attacks

Fighting off virtual attacks
UC Irvine computer science professor Michael Franz has devised a way to individualize software programs to help keep hackers from inflicting widespread damage. Credit: Steve Zylius/UC Irvine
UCI 5/12/14—Imagine a cyber world in which hackers, identity thieves, spammers, phishers, foreign spies and other miscreants have a much tougher time plying their trade. Thanks to UC Irvine computer science professor Michael Franz and his research group, such a world is closer to a reality. Franz, director of UC Irvine’s Secure Systems & Software Laboratory, is borrowing the idea of “biodiversity” from nature and applying it to the software that runs on digital devices from smartphones to supercomputers. His promising ideas have already won a U.S. patent and he has been awarded more than $11 million as a principal investigator for UC Irvine—including more than $7 million as sole principal investigator—from the Defense Advanced Research Projects Agency, the U.S. intelligence community, the Department of Homeland Security, and other funding entities.

View UCI Data Science Press Release

May 9, 2014 — Encouraging cross-campus collaboration

New Initiative for Data Science at UC Irvine
The Water UCI initiative will harness the brainpower of university experts, government entities, public agencies, the private sector, and user groups to identify effective solutions to global water issues. Credit: Hoang Xuan Pham / UC Irvine
UCI 5/9/14—Multidisciplinary projects focused on data science, medical humanities and water will be funded under the new Interschool Academic Initiative program, introduced in fall 2013 to identify, develop and support areas of multidisciplinary excellence. Recently, three new efforts were announced as part of that program: the Initiative for Data Science at UC Irvine, the Medical Humanities Initiative, and Water UCI. They join previously launched interschool initiatives in sustainability and exercise science. In an increasingly data-centric world, electronic information has become a critical element in modern research, education, medicine and business. The data science initiative will bring together faculty from almost every school to explore overlapping interests in the area of data science/big data, including methodology, infrastructure, theory, policy and education.

View UCI Data Science Press Release

May 8, 2014 — Laboratory researcher Joel Rowland to receive DOE Early Career Award

Los Alamos climate scientist wins DOE award
Joel Rowland
LANL 5/8/14 —Los Alamos National Laboratory researcher Joel Rowland is one of 35 national recipients of 2014 Early Career Research Program awards from the Department of Energy. Rowland’s research was recognized by DOE’s Office of Biological and Environmental Research for incorporating hydrological controls on carbon cycling in flood plain ecosystems into Earth System Models (ESM). Rowland has been a staff scientist since July 2010; his research focus has been on land surface dynamics in Arctic environments, with particular focus on how rivers and lakes in permafrost settings will respond to warming, permafrost loss and changes in hydrology. Rowland is part of the Laboratory’s Climate Ocean Sea Ice Modeling team on Arctic terrestrial hydrology and coupling terrestrial systems to ocean modeling. He also is part of DOE’s Next Generation Ecosystem Experiment Arctic research project team.

View LANL Data Science Press Release

May 8, 2014 — The revolution will be printed in 3-D

The revolution will be printed in 3-D
Three graduate students in architecture and urban design created a prototype of a 3-D-printed wrist splint, which could be used for quick, inexpensive and customized medical relief. Credit: UCLA Architecture and Urban Design
UCLA 5/8/14—Three-dimensional printing is an increasingly important tool for industry and research, and the terminology as well as the technology is creeping into the consumer market. But what is it? And how are UCLA faculty and students using it to create everything from bone splints to stunning fashion? The digital revolution has given us 24/7 access to every conceivable piece of information we might need (and much that we don't). You say you want a new revolution? Some believe we may be on the verge of one that's analogous: the ability to print anything, any time — not on paper, but in three dimensions. Shoes. Toys. Jewelry. Prosthetics. Pizzas. Apartments. (Yes, you can print yourself an apartment.)

View UCLA Data Science Press Release

May 6, 2014 — First-of-a-kind supercomputer at Lawrence Livermore available for collaborative research

New supercomputer to test industry big data apps
The Catalyst supercomputer at Lawrence Livermore employs a Cray CS300 architecture modified specifically for data-intensive computing. The system is now available for collaborative research with industry and academia.
LLNL 5/6/14—Catalyst, a first-of-a-kind supercomputer at Lawrence Livermore National Laboratory (LLNL), is available to industry collaborators to test big data technologies, architectures and applications. Developed by a partnership of Cray, Intel and Lawrence Livermore, this Cray CS300 high performance computing (HPC) cluster is available for collaborative projects with industry through Livermore's High Performance Computing Innovation Center (HPCIC). “Catalyst allows us to explore entirely new deep learning architectures that could have a huge impact on video analytics as well as broader application to big data analytics,” said Fred Streitz, director of the HPCIC.

View LLNL Data Science Press Release

May 6, 2014 — Berkeley Lab climate scientist: More extreme heat and drought in coming decades

Hotter, dryer world in coming decades
The best case (left) and worst case (right) scenarios considered in the National Climate Assessment both show temperature increases by the end of the century.
LBNL 5/6/14—By the end of this century, climate change will result in more frequent and more extreme heat, more drought, and fewer extremes in cold weather in the United States. Average high temperatures could climb as much as 10 or more degrees Fahrenheit in some parts of the country. These are some of the projections made by Lawrence Berkeley National Laboratory (LBNL) climate scientist Michael Wehner and his co-authors on the National Climate Assessment (NCA). “We have an even more thorough understanding of the human changes to the climate enabled in part by better computer models and their more complete representation of the climate system,” Wehner said.

View LBNL Data Science Press Release

May 1, 2014 — Edgy look at 2D molybdenum disulfide

Edgy look at 2D molybdenum disulfide
A new imaging technique allows rapid and all-optical determination of the crystal orientations of 2D semiconductor membranes at a large scale, providing the knowledge needed to use these materials in nanoelectronic devices.
LBNL 5/1/14—The drive to develop ultrasmall and ultrafast electronic devices using a single atomic layer of semiconductors has received a significant boost. Researchers with the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (LBNL) have recorded the first observations of a strong nonlinear optical resonance along the edges of a single layer of molybdenum disulfide. The existence of these edge states is key to the use of molybdenum disulfide in nanoelectronics, as well as to its use as a catalyst for the hydrogen evolution reaction in fuel cells, desulfurization and other chemical reactions.

View LBNL Data Science Press Release

April 19, 2014 — Saving crops and people with bug sensors

Saving crops and people with bug sensors
Yanping Chen, Eamonn Keogh, and Adena Why stand and hold insect sensor equipment. Credit: Peter Phun
UCR 4/19/14—UC Riverside researchers have created a method that can classify different species of insects with up to 99 percent accuracy, a development that could help farmers protect their crops from insect damage and limit the spread of insect-borne diseases, such as malaria and Dengue fever. Over the past 60 years, insect classification research has been limited by factors including an overreliance on acoustic sensing devices, a heavy focus on wingbeat frequency and limited data. The UC Riverside researchers overcame those limitations by building an inexpensive wireless bug sensor that can track many insect flight behavior patterns and generate much larger amounts of data that can then be incorporated into classification algorithms.
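
One way such sensor data can feed a classifier is Bayesian: combine a wingbeat-frequency likelihood with a prior such as the species’ typical activity at that hour. The sketch below uses invented numbers and is only a schematic of that general approach, not the UCR team’s model:

    # Illustrative Bayesian species classifier from wingbeat frequency,
    # with circadian activity as a prior. Numbers are made up; this is a
    # sketch of the general approach, not the UCR team's actual model.
    import math

    # species: (mean wingbeat Hz, std dev, prior prob of flying at dusk)
    SPECIES = {
        "Aedes aegypti":          (465.0, 40.0, 0.50),
        "Culex quinquefasciatus": (380.0, 35.0, 0.35),
        "Musca domestica":        (190.0, 30.0, 0.15),
    }

    def gaussian(x, mu, sigma):
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

    def classify(wingbeat_hz):
        post = {name: prior * gaussian(wingbeat_hz, mu, sd)
                for name, (mu, sd, prior) in SPECIES.items()}
        z = sum(post.values())
        return {name: p / z for name, p in post.items()}

    for name, p in sorted(classify(450.0).items(), key=lambda kv: -kv[1]):
        print(f"{name}: {p:.2f}")   # Aedes aegypti dominates at 450 Hz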

View UCR Data Science Press Release

April 16, 2014 — Los Alamos physicist honored with E.O. Lawrence Award

Los Alamos physicist receives E.O. Lawrence Award
John Sarrao
LANL 4/16/14 — Los Alamos National Laboratory physicist John Sarrao is being honored by the U.S. Department of Energy with the 2013 Ernest O. Lawrence Award in Condensed Matter and Materials Sciences. The citation reads: “For the discovery and study of new materials, especially those based on plutonium, advancing understanding of unconventional magnetic and superconducting states in strongly correlated f-electron condensed matter systems.” Sarrao has been the associate director for Theory, Simulation, and Computation at Los Alamos since March of 2013.

View LANL Data Science Press Release

March 16, 2014 — Climatologists offer explanation for widening of tropical belt

Climatologists analyze widening of tropics
This image shows the sea surface temperature anomaly in the Pacific Ocean from April 14–21, 2008. For detailed information, see http://bit.ly/1olgQpr. Credit: NASA/Remote Sensing Systems
UCR 3/16/14—The Earth’s tropical belt—demarcated, roughly, by the Tropics of Cancer and Capricorn—has progressively expanded since at least the late 1970s. A team of climatologists, led by researchers at UC Riverside, analyzed climate models used in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (2014) along with several observational and re-analysis data sets, and conducted their own climate model experiments to quantify tropical widening and to isolate its main cause. The team concludes that the recent widening of the tropical belt is primarily caused by multi-decadal sea surface temperature variability in the Pacific Ocean, due in part to anthropogenic pollutants. Their study was published March 16 in Nature Geoscience.

View UCR Data Science Press Release

April 29, 2014 — Atomic switcheroo explains origins of thin-film solar cell mystery

Atomic switcheroo: solar cell mystery solved!
Cross-sectional electron beam-induced current maps show the difference in cadmium telluride solar cells before (pictured above) and after (below) cadmium chloride treatment. The increased brightness after treatment indicates higher current collection at the grain boundaries.
NERSC/LBNL 4/29/14 — Scientists have known since the 1980s that treating cadmium-telluride (CdTe) solar cell materials with cadmium-chloride improves efficiency, but the underlying physics has remained a mystery until now. Combining electron microscopy with computer simulations run at the Department of Energy’s National Energy Research Scientific Computing Center (NERSC), researchers have put this decades-long debate to rest. Beyond providing a long-awaited explanation, this finding could lead to a less-expensive, more easily fabricated, thin-film alternative to silicon-based photovoltaic solar cells.

View LBNL/NERSC Data Science Press Release

April 29, 2014 — Future effects

Oscar-winner prof: Academia births special effects
UCSB’s Theodore Kim gives his acceptance speech at the Academy of Motion Picture Arts and Sciences’ Scientific and Technical Achievement Awards. (Credit: Greg Harbaugh / ©A.M.P.A.S.)
UCSB 4/29/14 — What will movies look like 100 years from now? Has the prevailing effects technology known as CGI (computer-generated imagery) already peaked, or is it just getting started? And is it making high-end effects more accessible or isolating lesser-budgeted potential graphics auteurs? These are among the questions that Theodore Kim asks in the Plous Lecture at UC Santa Barbara. In 2013, Kim won an Academy Award in Technical Achievement in recognition of software that he (with three others) developed as a post-doc and that has become an industry-standard technique for smoke and fire effects. His Plous Lecture also revisits a point he made in his Oscar acceptance speech: that academia is the frequent, yet under-credited, birthplace of innovations that are enabling big changes in filmmaking.

View UCSB Data Science Press Release

April 23, 2014 — Calming plasma’s stormy seas

Simulating fusion: calm pesky turbulence?
Interior view of the ITER tokamak reactor under construction in Cadarache, France. In a tokamak, turbulence caused by microinstabilities in the plasma can significantly impact energy confinement. Image: ITER
NERSC/LBNL 4/23/14 — For decades, controlled nuclear fusion has held the promise of a safe, clean, sustainable energy source that could help wean the world from fossil fuels. But the challenges of harnessing the power of the Sun in an Earth-based nuclear fusion reactor have been many. One key technical issue that has long puzzled physicists commonly occurs in fusion reactions: turbulence inside a reactor can increase the rate of heat loss from the hot plasma, significantly impacting the resulting energy output. So researchers have been working to pinpoint both what causes plasma turbulence and how to control or even eliminate it to boost a fusion reactor’s energy output. Now simulations run at the National Energy Research Scientific Computing Center (NERSC) have shed light on a central piece of the puzzle: the relationship between fast ion particles in the plasma and plasma microturbulence.

View LBNL/NERSC Data Science Press Release

April 23, 2014 — Superconducting qubit array points the way to quantum computers

Quantum computing one bitty step closer
The five cross-shaped devices are the Xmon variant of the transmon qubit placed in a linear array. (Credit: Erik Lucero)
UCSB 4/23/14 — A fully functional quantum computer is one of the holy grails of physics. Unlike conventional computers, the quantum version uses qubits (quantum bits), which make direct use of the multiple states of quantum phenomena. When realized, a quantum computer will be millions of times more powerful at certain computations than today’s supercomputers. A group of UC Santa Barbara physicists has moved one step closer to making a quantum computer a reality by demonstrating a new level of reliability in a five-qubit array. Their findings appeared in the journal "Nature".

View UCSB Data Science Press Release

April 23, 2014 — Planning professor-turned-entrepreneur to help SF tackle urban problems with Big Data

Synthicity: Simulating solutions for San Francisco
UCB 4/23/14 — Paul Waddell, a city planning professor at UC Berkeley with a penchant for conducting research with what he calls “big urban data,” is putting his work to a real-world test in San Francisco. His high-tech startup Synthicity will work with the San Francisco Planning Department on new simulation, planning and urban development tools and technologies. Waddell has teamed up with specialists in 3D computer graphics and students in city and regional planning to develop UrbanCanvas, a new platform for designing and visualizing alternative development projects and zoning policies, in conjunction with a 3D visualization platform called GeoCanvas that interprets massive urban datasets quickly and interactively.

View UCB Data Science Press Release

April 22, 2014 — Why Earth matters to NASA: A conversation with Harrison Ford

Harrison Ford learns NASA climate change research
Harrison Ford (right) gets a close-up look at NASA research on deforestation and other global forest cover changes at Ames Research Center.
NASA Ames 4/22/14 — Actor Harrison Ford was on location at NASA’s Ames Research Center, Moffett Field, Calif., last November to film a segment of Showtime’s Years of Living Dangerously documentary on climate change. Ford toured the NASA Advanced Supercomputing (NAS) facility, where he met with scientists Rama Nemani of Ames and Matthew Hansen of the University of Maryland, College Park, to learn more about how NASA satellite data, research, and technologies are used around the world to better understand and protect Earth. Analyzing massive datasets like global forest cover change over time is a unique capability of NASA’s Earth Exchange (NEX), a project based at NAS and led by Nemani. NEX allows researchers to share knowledge, research, and tools to address global environmental challenges.

View AMES Data Science Press Release

April 18, 2014 — UCLA to host ‘Mathematics of Politics’ workshop, April 22–23

Data analysis reshaping political campaigns
UCLA professor Lynn Vavreck
UCLA 4/18/14 — A UCLA workshop April 22 and 23 called “Mathematics of Politics” will address how data analysis based on sophisticated mathematics is reshaping political campaigns. Sponsored by UCLA's Institute for Pure and Applied Mathematics, the workshop will bring together researchers and practitioners from mathematics, statistics, computer science, electrical engineering, political science and other fields.

View UCLA Data Science Press Release

April 18, 2014 — Student ‘hackers’ design new ways to research the Free Speech Movement

“Hacking” the Free Speech Movement
A four-student team designed this first-place interface for the FSM Digital Archive.
UCB 4/18/14 — In the half-century since UC Berkeley’s Free Speech Movement (FSM), the rousing oratory of Mario Savio and iconic images of mass demonstrations have come to stand in for the 1964 movement and its legacy. The FSM Digital Archive—a large trove of texts, images and audio recordings that the Bancroft Library digitized and published to the Web in the mid-1990s—was one of the Bancroft’s first forays into digital archiving. But it has technical limitations: Researchers can download and read files one at a time, but there’s no way to analyze the archive as a single data set as scholars in the “digital humanities” and “digital social sciences” are now doing — harnessing computer power to look for patterns across bodies of text, carry out computational analysis, or visualize information in new and compelling ways. For a “hackathon” called HackFSM, organized by the Bancroft Library and Digital Humanities @ Berkeley, students were invited to develop a user interface to make primary-source FSM materials more accessible to researchers and the public.

View UCB Data Science Press Release

April 14, 2014 — The mechanism of short-term memory

Brain’s big picture of short-term visual memory
Lester Ingber, UC San Diego Ph.D. '66
SDSC 4/14/14 — Using the Trestles supercomputer at the San Diego Supercomputer Center, researchers found for the first time that in-sync, large-scale brain waves affecting various regions of the brain hold memories of objects just viewed. The study provides more evidence that large-scale electrical oscillations across distant brain regions may carry information for visual memories. For Lester Ingber, a researcher based in Ashford, Oregon, who received a Ph.D. in physics from UC San Diego in 1966, such studies by established researchers, which have begun to emerge only in recent years, highlight a concept he has been investigating for more than 30 years: the relationship between large-scale, or “top-down,” activities in the brain and short-term memory and consciousness.

View SDSC Data Science Press Release

April 14, 2014 — Proposals sought for next-generation supercomputing technologies

Wanted: extreme computing
LLNL 4/14/14 — The U.S. Department of Energy's (DOE) Office of Science and the National Nuclear Security Administration (NNSA) have issued a request for proposals to further develop "extreme scale" supercomputer technology. The latest request for proposals is now available on the Web. Contracts will total about $100 million, and the funding period will be from July 2014 to November 2016. Proposals, under what is being called FastForward 2, will be due May 9. FastForward seeks proposals from any company interested in working on extreme scale node architectures or memory technologies. Vendors from the computing industry are awarded research contracts to accelerate technologies critical to the development of next-generation, extreme scale supercomputers, which will provide the necessary simulation and computing capabilities to achieve DOE’s missions of national defense, scientific research and energy security.

View LLNL Data Science Press Release

April 14, 2014 — Can new understanding avert tragedy?

Climate change vs. global conflict
Integrating results from dozens of studies, Sol has shown strong links between temperature change and spikes in human conflict. (Credit: Photo Peg Skorpinski)
UCB 4/14/14 — Sol Hsiang applies complex statistical strategies to examine sobering, large-scale versions of the same basic question: How will environmental change affect life? Hsiang’s expertise lies in finding the connections between seemingly unrelated sources of data needed to answer disquieting questions: What is the likely impact of climate change on global patterns of conflict and on family survival? As research increasingly points to a global temperature rise in coming decades, dozens of forecasts examine the potential impacts on health, agriculture, water supply and migration. But beneath these disruptions lies the possibility of greatly increased localized crime and regional conflict. “One thing we found from looking at all these studies together that no one had pointed out is that unusually cold epochs and unusually warm periods are both associated with greater conflict,” said Hsiang.

View UCB Data Science Press Release

April 8, 2014 — Groundbreaking online registry to drive brain disease research

Register your brain
UCSF 4/8/14 — A new online project led by researchers at UC San Francisco promises to cut the time and cost of conducting clinical trials for brain diseases, while also helping scientists analyze and track the brain functions of thousands of volunteers over time. With easy online registration, the Brain Health Registry is designed to create a ready pool of research subjects for studies on neurological diseases, such as Alzheimer’s and Parkinson’s, as well as depression, post-traumatic stress disorder, and many other brain ailments. About one-third of the cost of running a clinical trial comes from having to recruit patients, and many trials fail or are delayed because of it. The Brain Health Registry is the first neuroscience project to use the internet on such a scale to advance clinical research.

View UCSF Data Science Press Release

April 7, 2014 — SDSC enables large-scale data sharing using Globus

Globus: “dropbox” for sharing Big Data
SDSC 4/7/14 — In the era of Big Data-enabled science, accessing and sharing of huge data sets plays a key role for scientific collaboration and research. Some San Diego Supercomputer Center (SDSC) users need to share large data sets with collaborators who may not have accounts on SDSC resources. Described as a “dropbox for science,” Globus is already widely used by resource providers and users who need a secure and reliable way to transfer files. SDSC is the first supercomputer center in the National Science Foundation’s XSEDE (eXtreme Science and Engineering Discovery Environment) program to offer the new and unique Globus sharing service.
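
For readers who want to script against the service today, the globus-sdk Python package (which postdates this announcement) exposes the sharing primitives. A minimal sketch of granting a collaborator read access to a folder, with placeholder endpoint and identity IDs and assuming a shared endpoint and a valid transfer-scoped token:

    # Minimal sketch of sharing a folder on a Globus shared endpoint via
    # the modern globus-sdk package. All IDs and the token are placeholders.
    import globus_sdk

    TOKEN = "..."                  # transfer-scoped access token (placeholder)
    SHARED_ENDPOINT = "aaaa-bbbb"  # hypothetical shared endpoint UUID
    COLLABORATOR_ID = "cccc-dddd"  # hypothetical Globus identity UUID

    tc = globus_sdk.TransferClient(
        authorizer=globus_sdk.AccessTokenAuthorizer(TOKEN))

    rule = {
        "DATA_TYPE": "access",
        "principal_type": "identity",
        "principal": COLLABORATOR_ID,
        "path": "/results/run42/",
        "permissions": "r",        # read-only share
    }
    result = tc.add_endpoint_acl_rule(SHARED_ENDPOINT, rule)
    print("Access rule created:", result["access_id"])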

View SDSC Data Science Press Release

April 4, 2014 — Scientists generate 3D structure for the malaria parasite genome

Targeting malaria in 3D
3D modeling of the human malaria parasite genome at one of the stages of its life cycle. Each color represents one of the 14 chromosomes of the parasite genome, the exception being purple (indicates genes known to be involved in virulence). (Credit: Le Roch Lab, UC Riverside)
UCR 4/4/14 — A research team led by a cell biologist at UC Riverside has generated a 3D model of the human malaria parasite genome at three different stages in the parasite’s life cycle—the first time such 3D architecture has been generated during the progression of the life cycle of any parasite. “Understanding the spatial organization of chromosomes is essential to comprehend the regulation of gene expression in any eukaryotic cell,” said Karine Le Roch, an associate professor of cell biology and neuroscience, who led the study. “Now we can more carefully search for components or drugs that can disrupt this organization, helping in the identification of new anti-malaria strategies.” According to the World Health Organization, an estimated 207 million people were infected with malaria in 2012 alone, leading to 627,000 deaths.

View UCR Data Science Press Release

April 3, 2014 — At Berkeley, experts mine questions of Big Data, power and privacy

Big Data, power, and privacy
Deirdre Mulligan: Looking at how information is “crossing borders and boundaries.”
UCB 4/3/14 — In response to President Obama’s call for a review of privacy issues in the context of increased digital information and the computing power to process it, UC Berkeley co-hosted an all-day workshop “Big Data: Values and Governance” with the Office of Science and Technology Policy (OSTP). Introducing the first of four panels of experts, Deirdre Mulligan, an assistant professor in the School of Information and co-director of the Berkeley Center for Law and Technology, compared today to the 1970s: another time marked by “breathtaking revelations of government overreaching into private lives, secret data banks with questionable legal authority and at times illicit goals that caught the population off-guard,” as well as “a deep commitment to secrecy.” “What we’re talking about here is power,” said UCB law professor Kenneth Bamberger. “In a world of Big Data, where data is flowing back and forth at all times, from multiple different sources, and no one knows, often, exactly how it’s being used, these questions of power can’t be overcome by simply individual choices.” See also the workshop program at http://www.ischool.berkeley.edu/newsandevents/events/2014bigdataworkshop

View UCB Data Science Press Release

April 3, 2014 — SDSC establishes data science workflows ‘center of excellence’

An easier way with WorDS
SDSC 4/3/14 — The San Diego Supercomputer Center (SDSC) at UC San Diego has formally established a new ‘center of excellence’ to assist researchers in creating workflows to better manage the tremendous amount of data being generated across a wide range of scientific disciplines, from natural sciences to marketing research. A data science workflow is the process of combining data and processes into a configurable, structured set of steps that lead to automated computational solutions of an application. The new WorDS Center (Workflows for Data Science Center of Excellence) will allow scientists to focus on their specific areas of research rather than having to solve workflow issues or the computational challenges that arise as data analysis progresses from task to task.
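
In code terms, a workflow of this kind is a declared sequence of configurable steps that can be rerun end to end without touching the analysis logic. A generic illustration (illustrative only, not the WorDS Center’s software):

    # Generic illustration of a data science workflow: named, configurable
    # steps chained so the pipeline can be rerun or modified without
    # touching the analysis code. Not the WorDS Center's software.
    class Workflow:
        def __init__(self, name):
            self.name, self.steps = name, []

        def step(self, fn):
            self.steps.append(fn)   # register in execution order
            return fn

        def run(self, data):
            for fn in self.steps:
                print(f"[{self.name}] running {fn.__name__}")
                data = fn(data)
            return data

    wf = Workflow("gene-expression")

    @wf.step
    def clean(rows):
        return [r for r in rows if r is not None]

    @wf.step
    def normalize(rows):
        top = max(rows)
        return [r / top for r in rows]

    @wf.step
    def summarize(rows):
        return sum(rows) / len(rows)

    print(wf.run([4.0, None, 2.0, 8.0]))   # each step runs in order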

View SDSC Data Science Press Release

April 2, 2014 — Americans using more energy according to Lawrence Livermore analysis

Americans guzzle more energy, emit more carbon
The 2013 energy flow chart released by Lawrence Livermore National Laboratory details the sources of energy production, how Americans are using energy and how much waste exists. (Credit: LLNL)
LLNL 4/2/14 — Americans used more renewable, fossil, and nuclear energy in 2013 than in 2012, according to the most recent energy flow charts released by Lawrence Livermore National Laboratory. Each year, the Laboratory releases energy flow charts that illustrate the nation’s consumption and use of energy. Overall, Americans used 2.3 quadrillion British thermal units (BTU) more energy in 2013 than the previous year. The Laboratory also has released a companion chart illustrating the nation’s energy-related carbon dioxide emissions. Americans’ carbon dioxide emissions increased to 5,390 million metric tons, the first annual increase since 2010.

View LLNL Data Science Press Release

April 1, 2014 — New applied modeling and simulation seminar video archive

Applied modeling/simulation seminar video archive
Seminars in the NASA Advanced Supercomputing facility series are recorded for on-demand playback on the web.
NASA Ames 4/1/14 — Hosted by the Applied Modeling and Simulation (AMS) Branch at the NASA Advanced Supercomputing (NAS) facility, the AMS seminar series presents talks on recent achievements, innovative tools, and current problems faced by members of the modeling and simulation community from NASA, government, industry, and academia. Typically held weekly, the seminars are recorded for on-demand playback on the web in the AMS seminar archive, where visitors can view videos, abstracts and speaker biographies from recent seminars; find announcements of upcoming talks; and search a new archive of past seminar topics. More at
http://www.nas.nasa.gov/publications/ams/ams.html.

View AMES Data Science Press Release

April 1, 2014 — The Meeting Point

Art and science symposium
MORPHLOW: created by a student in UCSB’s Media Arts and Technology Program using scientific, computational algorithms for biological rules translated into a physical object via 3D printer. (Credit: R.J. Duran)
UCSB 4/1/14 — Scholars have frequently suggested that art and science find their meeting point in method. That idea could be the tagline for the Interrogating Methodologies symposium taking place at UC Santa Barbara on April 18 and 19. The symposium explores boundaries in art and science and seeks to initiate conversation among specialists from the sciences, social sciences, humanities and arts, all of whom grapple with questions about how those communities intersect.

View UCSB Data Science Press Release

March 31, 2014 — Where to get Viagra news? (Really, this isn’t spam)

News about Viagra? (Really, this isn’t spam)
Vagelis Hristidis, an associate professor of computer science and engineering, led a study of how people use social networks to find health information
UCR 3/31/14 — Do you want information on Viagra or ibuprofen? Check out general social networks such as Twitter and Pinterest. Interested in sleep disorders or depression? You’re better off going to specialized health social networks such as WebMD or drugs.com. That is one of the findings of a paper “Pharmaceutical Drugs Chatter on Online Social Networks,” based on an analysis of more than 1 million drug-related posts, by a team of researchers led by Vagelis Hristidis at UC Riverside’s Bourns College of Engineering and Department of Political Science. The findings have implications for a wide range of stakeholders.

View UCR Data Science Press Release

March 26, 2014 — Here today, gone to meta

Here today, gone to meta
UCSB Library eyes digital curation service to help preserve research data created across campus.
UCSB 3/26/14 — With technology advancing at warp speed and data proliferating apace, can scientific and scholarly output survive into the future? Enter Data Curation @ UCSB, an effort to address that very problem on campus. “There is now an expectation that if you are gathering data, it will be available right now, it will always be available, and it will be available in a digital form where I can use it,” said Greg Janée, a digital library research specialist at UCSB’s Earth Research Institute (ERI) and Data Curation @ UCSB project lead. “Keeping bits alive is a much higher-tech problem than keeping books alive,” added James Frew, professor of environmental informatics at UCSB’s Bren School of Environmental Science & Management and Janée’s research partner on the project. “Culture guarantees we’ll be able to read English in 300 years, but nothing guarantees we’ll be able to read a CD-ROM in even 10 years.” Now in its second year, the pilot project is using insights from a launch-year faculty survey to shape the inaugural iteration of a data curation service to be based at the UCSB Library.

View UCSB Data Science Press Release

March 25, 2014 — Could closing the high seas to fishing save migratory fish?

No fishing high seas: save wild fish, up profits
Global map of exclusive economic zones (green) and high seas (blue) oceanic areas (Credit: Courtesy photo)
UCSB 3/25/14 — Wild fish are in peril worldwide, particularly in international waters. Operating as a massive unregulated global commons where any nation can take as much as it wants, the high seas are experiencing a latter-day “tragedy of the commons,” with the race for fish depleting stocks of tuna, billfish and other high-value migratory species. A new paper by Christopher Costello, a professor of resource economics at UC Santa Barbara’s Bren School of Environmental Science & Management, and a coauthor suggests a bold approach to reversing this decline: close the high seas to fishing. The researchers developed a computer simulation model of global ocean fisheries and used it to examine a number of management scenarios, including a complete closure of fishing on the high seas. The model tracked the migration and reproduction of fish stocks in different areas, and quantified the fishing pressure or activity, catch, and profits by each fishing nation under various policies. They found that closing the high seas could more than double both populations of key species and fisheries profit levels, while increasing fisheries yields by more than 30%.
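
The flavor of such a bioeconomic simulation can be captured in a deliberately tiny two-zone model with logistic growth, migration between national waters (EEZs) and the high seas, and fixed fishing rates. All parameters below are invented, and the published model is far richer; the toy only illustrates the direction of the effect:

    # Deliberately tiny two-zone bioeconomic model (EEZ + high seas) in
    # the spirit of the study's simulation. Parameters are illustrative.
    def simulate(f_hs, years=100, r=0.4, K=100.0, f_eez=0.2,
                 price=1.0, cost=2.0, mig=0.3):
        eez = hs = K / 4.0
        profit = 0.0
        for _ in range(years):
            total = eez + hs
            growth = r * total * (1.0 - total / K)   # shared logistic growth
            eez += growth / 2.0
            hs += growth / 2.0
            flow = mig * (hs - eez) / 2.0            # stocks mix across zones
            eez += flow
            hs -= flow
            catch = f_eez * eez + f_hs * hs
            profit += price * catch - cost * (f_eez + f_hs)   # minus effort cost
            eez *= 1.0 - f_eez
            hs *= 1.0 - f_hs
        return eez + hs, profit

    for f_hs in (0.5, 0.0):   # status quo vs. closing the high seas
        stock, profit = simulate(f_hs)
        print(f"high-seas fishing rate {f_hs}: stock {stock:.1f}, "
              f"cumulative profit {profit:.1f}")
    # In this toy, closure lifts both the standing stock and cumulative
    # profit, as spillover boosts the catch inside national waters.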

View UCSB Data Science Press Release

March 27, 2014 — Human-induced climate change reduces chance of flooding in Okavango Delta

Computer models simulate African floods
This image is a compilation of three images from Envisat’s radar and shows where southwestern Africa’s Okavango River, which originates in Angola, empties into the inland Okavango Delta in northern Botswana. (Image credit: European Space Agency)
LBNL/NERSC 3/27/14 — Researchers at the Lawrence Berkeley National Laboratory (LBNL), the University of Cape Town, and the United Nations Development Programme have analyzed how human-induced climate change has affected recent flooding in the Okavango River, an ecologically and geographically unique river basin in southern Africa. After seasonal rains fall in southern Angola, floodwaters flow slowly down the Okavango River into semi-arid northwestern Botswana, where the river spreads, floods, and eventually evaporates within the inland Okavango Delta. The annual floods of 2009, 2010, and 2011 all reached extents last seen decades ago. Were these unusually high floods related to human-induced climate change? In this study, a first of its kind carried out on the African continent, researchers addressed this question by producing a unique set of simulations generated by two computer models of the climate system.

View LBNL/NERSC Data Science Press Release

March 26, 2014 — Big Data’s exponential rise brings changes, challenges, and opportunities

Challenges, opportunities of Big Data
Faculty and industry leaders discussed the latest research, services, and education development in big data. (l to r) Michael Zeller, founder/CEO of San Diego-based Zementis; SDSC Director Michael Norman; SDSC’s PACE Director Natasha Balac; Larry Smarr, founding director of Calit2; and Stefan Savage, professor of computer science and engineering, UC San Diego. (Image: Erik Jepsen, UC San Diego Publications)
SDSC 3/26/14 — Big data — defined as the gathering, storage, and analysis of massive amounts of computerized data — is big business: a market growing at 45 percent annually that will reach $25 billion by 2015. Harnessing big data — and having the ability to extract meaningful information from it to advance scientific discovery — was a key topic during a recent symposium called “Big Data at Work: A Conversation with the Experts,” held at Atkinson Hall at UC San Diego’s Qualcomm Institute. “What’s different about big data in science now is the challenge of taking trillions of bytes of data that come from somewhere else, reading them into a supercomputer, and doing something with that data that contributes to new science and new knowledge,” said SDSC director Michael Norman.

View SDSC Data Science Press Release

March 19, 2014 — SDSC’s Gordon supercomputer assists in whole-genome sequencing analysis

Gordon supercomputer helps sequence whole genome
SDSC 3/19/14 — A recent whole-genome sequencing (WGS) analysis project, supported by the San Diego Supercomputer Center (SDSC) at UC San Diego, assisted in a major study of rheumatoid arthritis. The collaborative project, which also involved Janssen Research and Development, LLC and the Scripps Translational Science Institute, demonstrated the effectiveness of innovative applications of flash memory technology to rapidly process large data sets that are pervasive throughout human genomics research.

View SDSC Data Science Press Release

March 17, 2014 — As students embrace new ways to learn, library learns to adapt

Library adapts to students’ digital learning styles
Moffitt Library information gateway and reading lounge
UCB 3/17/14 — At Moffitt Undergraduate Library’s new information gateway and reading lounge, students have access to 2,000 new books, a top-notch computing experience, free digital scanning, reduced-price printing and — starting this semester — a laptop and digital-tablet check-out service.

View UCB Data Science Press Release

March 12, 2014 — NCEAS upgrades widely used scientific data repository

NCEAS upgrades popular scientific data repository
The upgraded KNB Data Repository features a new user interface and improved search function
UCSB 3/12/14 — With small labs, field stations and individual researchers collectively producing the majority of scientific data, the task of storing, sharing and finding the millions of smaller datasets requires a widely available, flexible and robust long-term data management solution. This is especially true now that the National Science Foundation (NSF) and a growing number of scientific journals require authors to openly store and share their research data. In response, UC Santa Barbara’s National Center for Ecological Analysis and Synthesis (NCEAS) has released a major upgrade to the KNB Data Repository (formerly the Knowledge Network for Biocomplexity). The upgrade improves access to the repository and better supports the data management needs of ecological, environmental and earth science labs and individual researchers.

View UCSB Data Science Press Release

March 11, 2014 — Making sense of ‘big data’

Making sense of ‘big data’
Ben Recht with graduate students Nick Boyd (Electrical Engineering and Computer Science) and Ashia Wilson (Statistics). Photo: Peg Skorpinski
UCB 3/11/14 — Big data generates excitement because of its potential to contain answers to important questions. But as Ben Recht knows, often just coming up with the right questions to make sense of a mountain of data is tough. As an assistant professor of statistics and of electrical engineering and computer science, Recht’s work in the NSF-funded Algorithms, Machines, and People (AMP) Lab focuses on simplifying data analysis and developing strategies for solving problems common to many data investigations.

View UCB Data Science Press Release

March 6, 2014 — Preschoolers outsmart college students at figuring out gizmos

Preschoolers outsmart adults at figuring out gizmos
A new study shows children can sometimes outsmart grownups when it comes to figuring out how gadgets work because they’re less biased in their ideas about cause and effect. (Video by Roxanne Majasdjian and Philip Ebiner) https://www.youtube.com/watch?feature=player_embedded&v=bHQ0DemKcEA
UCB 3/6/14 — Preschoolers can be smarter than college students at figuring out how unusual toys and gadgets work because they’re more flexible and less biased than adults in their ideas about cause and effect, according to new research from UC Berkeley and the University of Edinburgh. Overall, the youngsters were more likely to entertain unlikely possibilities to figure out what makes a device work. That confirmed the researchers’ hypothesis that preschoolers and kindergartners instinctively follow Bayesian logic, a statistical model that draws inferences by calculating the probability of possible outcomes.
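
The underlying Bayesian update is compact. A toy version with invented numbers, comparing an “obvious” rule (any single block activates the gadget) against an “unusual” one (only a combination of blocks works) after observing a lone block fail and a pair succeed:

    # Toy Bayesian update of the kind the study attributes to children:
    # weighing an "obvious" rule against an "unusual" one as evidence
    # arrives. All numbers are illustrative.
    # H1 "single": any single block activates the gadget.
    # H2 "combo":  only a combination of two blocks activates it.
    prior = {"single": 0.8, "combo": 0.2}   # adults start biased toward H1

    # P(observations | hypothesis): the gadget stayed off for one block
    # alone, then lit up when two blocks were placed together.
    likelihood = {
        "single": 0.1 * 0.9,   # H1 poorly explains the lone block failing
        "combo":  0.9 * 0.9,   # H2 predicts both observations well
    }

    post = {h: prior[h] * likelihood[h] for h in prior}
    z = sum(post.values())
    for h in post:
        print(f"P({h} | data) = {post[h] / z:.2f}")
    # The unusual combination rule overtakes the prior favorite -- unless
    # a strong enough prior (the adult bias) suppresses it.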

View UCB Data Science Press Release

February 27, 2014 — LLNL's ‘Science On Saturday’ lecture explains the cardioid project

Lecture explains the cardioid project
The Cardioid code developed by a team of Livermore and IBM scientists divides the heart into a large number of manageable pieces, or subdomains. The development team used two approaches, called grid (above) and Voronoi (not shown), to break the enormous computing challenge into much smaller individual tasks. See https://str.llnl.gov/Sep12/streitz.html
LLNL 2/27/14 — Computer modeling is a powerful tool for scientific inquiry when experiments are too costly, too dangerous or simply impossible. In this talk, presenters describe how to build a computer model of a human heart, starting from an individual cell and then using data from an actual person to create a realistic representation of a beating heart. Examples show how doctors and researchers may soon be able to use such simulations to investigate the effects of new drugs on cardiac rhythms or improve the success rate of complex surgical procedures.
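
The “grid” decomposition mentioned in the caption can be sketched in a few lines: slice the simulation volume into equal boxes and hand one to each compute task. The sketch below illustrates the general technique only; it is not the Cardioid code:

    # Sketch of "grid" style domain decomposition: carve a 3-D simulation
    # volume into equal boxes, one per compute task. Illustrative only --
    # not the Cardioid code itself.
    import numpy as np

    heart = np.random.rand(64, 64, 64)   # stand-in for a voxelized heart

    def grid_decompose(volume, tasks_per_axis):
        n = tasks_per_axis
        sx, sy, sz = (s // n for s in volume.shape)
        subdomains = []
        for i in range(n):
            for j in range(n):
                for k in range(n):
                    box = volume[i*sx:(i+1)*sx, j*sy:(j+1)*sy, k*sz:(k+1)*sz]
                    subdomains.append(((i, j, k), box))
        return subdomains

    parts = grid_decompose(heart, 4)     # 64 subdomains of 16^3 voxels each
    print(len(parts), parts[0][1].shape)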

View LLNL Data Science Press Release

February 26, 2014 — Twitter ‘big data’ can be used to monitor HIV and drug-related behavior

Twitter 'big data' monitors HIV, drug-related behavior
UCLA 2/26/14 — Real-time social media like Twitter could be used to track HIV incidence and drug-related behaviors with the aim of detecting and potentially preventing outbreaks, a new UCLA-led study shows. The study, published in the peer-reviewed journal Preventive Medicine by authors from UCLA’s Center for Digital Behavior, suggests it may be possible to predict sexual risk and drug use behaviors by monitoring tweets, mapping where those messages come from and linking them with data on the geographical distribution of HIV cases. The use of various drugs had been associated in previous studies with HIV sexual risk behaviors and transmission of infectious disease.
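
In outline, such a pipeline filters tweets for risk-related terms and aggregates counts by location so they can be compared with surveillance data. A schematic sketch with hypothetical field names and keywords, not the study’s actual code:

    # Schematic of the monitoring idea: filter tweets for risk-related
    # keywords, then aggregate counts by location for comparison with
    # regional HIV case data. Field names and keywords are hypothetical.
    from collections import Counter

    RISK_TERMS = {"meth", "party and play", "hookup"}   # illustrative terms

    def risk_counts_by_county(tweets):
        counts = Counter()
        for tw in tweets:                  # tw: {"text": ..., "county": ...}
            text = tw["text"].lower()
            if any(term in text for term in RISK_TERMS):
                counts[tw["county"]] += 1
        return counts

    sample = [
        {"text": "great hookup app tbh", "county": "Los Angeles"},
        {"text": "studying all night",   "county": "Los Angeles"},
        {"text": "meth ruins lives",     "county": "Riverside"},
    ]
    print(risk_counts_by_county(sample))
    # Per-county counts could then be correlated against HIV surveillance
    # data to flag regions for follow-up.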

View UCLA Data Science Press Release

February 21, 2014 — Samsung, UCSF partner to accelerate new innovations in preventive health technology

UCSF, Samsung partner for preventive health tech
Young K. Sohn, president and chief strategy officer of Samsung Electronics
UCSF 2/21/14 — Samsung Electronics Co., Ltd., and UC San Francisco have announced a partnership to accelerate validation and commercialization of promising new sensors, algorithms, and digital health technologies for preventive health solutions. The two organizations will jointly establish the UCSF-Samsung Digital Health Innovation Lab, a new space on UCSF’s Mission Bay campus in San Francisco, where some of the world’s leading researchers and technologists will be able to develop and run trials to validate new mobile health technologies. The joint innovation lab will be a first-of-its-kind test bed where entrepreneurs and innovators can validate their technologies and accelerate the adoption of new preventive health solutions.

View UCSF Data Science Press Release

February 20, 2014 — Grad students to continue heritage research with fellowship

Grad students’ digital analysis of human heritage
Paola Di Giuseppantonio Di Franco will be at Cambridge University’s McDonald Institute for Archaeological Research
UCM 2/20/14 — Two UC Merced graduate students will continue their research into world heritage with the support of prestigious Marie Curie Intra-European Fellowships for Career Development. Fabrizio Galeazzi, who is developing ways to share 3D simulations of cultural heritage in real time over the Internet, will spend the next two years with the Centre for Digital Heritage, part of the University of York’s Department of Archaeology. His research also includes studying various 3D technologies to understand their different benefits and limitations for the preservation and reconstruction of the past. Paola Di Giuseppantonio Di Franco will be at Cambridge University’s McDonald Institute for Archaeological Research. Her research looks at how people perceive ancient artifacts through different media, such as photos, 3D simulations, 3D prints or the original artifacts. She has shown that people focus on different qualities of objects depending on how they interact with them, and that each medium produces a different level of engagement, influencing the understanding of cultural heritage.

View UCM Data Science Press Release