at the American Institute of Mathematics, Palo Alto, California
organized by Ali Nadim and Ami Radunskaya
This workshop, sponsored by AIM and the NSF, will bring together mathematicians, graduate students, and industry and public agency representatives to work on mathematical modeling problems related to planet Earth. Topics the workshop may address include the understanding and mitigation of scourges such as desertification and forest fires; solar and wind energy; the sustainability of water resources; remote sensing and environmental monitoring of Earth processes; and epidemics and the spread or eradication of diseases among animals, plants, and humans.
On the first day of the workshop, each problem will be described by an expert engineer or scientist who represents the industry or public agency and who is well versed in the problem area. Teams will then work intensively during the week on problem formulation, analysis, and implementation. Each team will give a formal presentation on their solution on the last day. The style of the workshop will be a blend of the format of the Math-in-Industry Study Group introduced in Oxford, and the focused, collaborative style of AIM workshops.
Problem 1. Using In Vitro Data to Predict Pathway-based Effects of Environmental Chemicals
Humans and ecological species are exposed to thousands of man-made chemicals throughout their lives. For humans, we see increasing rates of certain types of cancers, birth defects, and reproductive issues that are hypothesized to be due to some of these exposures. The gold standard for understanding the potential health effects of chemical exposure is experiments in animals. However, these studies are very expensive and time-consuming; as a consequence, only a small fraction of the chemicals to which we are exposed have been thoroughly studied. A second issue with standard animal studies is that they do not provide all of the mechanistic information needed to understand the relevance of any findings, in rats for instance, to people. A promising alternative is to test chemicals in so-called "high-throughput screening" (HTS) assays. These measure the effect of chemicals in cells (often human) or against particular proteins or other biomolecules in a quantitative, concentration-response manner. This approach partially solves the two issues already mentioned, namely that of too many chemicals and the need to generate information directly on human cells rather than those of test species. The problem is that the results do not automatically integrate all of the molecular and cellular effects that a chemical can cause. What is required next is a class of mathematical or computational models to perform this integration and thereby predict the potential for adverse effects in the whole human, combining chemical effects at the molecular and cellular level with knowledge of molecular and cellular pathways.
The overall modeling approach can be decomposed into a sequence of questions: Does a chemical perturb a given pathway P, and if so, what is the minimum concentration at which a perturbation is seen? How does perturbation of P affect the fate of cells in which P is active? What tissue- or organ-level adverse effects arise from these cellular changes? We will focus on the first question using data from the ToxCast program, which is generating data on about 1800 environmental chemicals in hundreds of human pathway-based assays. The particular problem we will address is determining whether a particular pathway is perturbed by integrating over multiple assays that probe the same pathway. The reason for using multiple assays is that each assay is somewhat "noisy," for either technical or genuine biological reasons. The goal is then to provide a probability value or a classification for whether each chemical perturbs the pathway at each tested concentration. Possible approaches include regression, machine learning, and Bayesian methods; a minimal Bayesian sketch is given below.
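As one concrete illustration of the Bayesian option, the sketch below pools per-assay binary hit calls at a single tested concentration into a posterior probability that the pathway is perturbed. The per-assay sensitivities and specificities, the prior, and the hit calls are all assumed values chosen for demonstration, not ToxCast data or a prescribed method.

    import numpy as np

    def pathway_posterior(hits, sensitivity, specificity, prior=0.1):
        """Posterior probability that the pathway is perturbed, given
        per-assay hit calls (1 = active, 0 = inactive) at one concentration."""
        hits = np.asarray(hits)
        # Log-likelihood of the calls if the pathway is perturbed (H1) vs. not (H0),
        # assuming calls are conditionally independent across assays.
        ll1 = np.sum(hits * np.log(sensitivity) + (1 - hits) * np.log(1 - sensitivity))
        ll0 = np.sum(hits * np.log(1 - specificity) + (1 - hits) * np.log(specificity))
        log_odds = np.log(prior / (1 - prior)) + ll1 - ll0
        return 1.0 / (1.0 + np.exp(-log_odds))

    # Hypothetical example: five assays probing the same pathway; four call a hit.
    hits = [1, 1, 0, 1, 1]
    sens = np.array([0.80, 0.70, 0.90, 0.75, 0.85])   # assumed per-assay sensitivity
    spec = np.array([0.95, 0.90, 0.85, 0.90, 0.95])   # assumed per-assay specificity
    print(pathway_posterior(hits, sens, spec))        # posterior P(perturbed | calls)

Repeating this at each tested concentration would yield the probability-versus-concentration profile the problem asks for; regression or hierarchical models could replace the assumed error rates with fitted ones.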
Problem 2. Attributing tropospheric ozone formation to precursor sources considering nonlinear chemistry
Tropospheric ozone is deleterious to both human and ecosystem health. It is produced through nonlinear chemical reactions involving precursor emissions of nitrogen oxides (NOx) and volatile organic compounds (VOCs). Comprehensive air pollution models include detailed treatment of the emissions, transport (advection, turbulence, clouds), and chemistry dictating the spatial and temporal distribution of O3 and related precursor species. From a policy standpoint, a question of interest is: how much does an individual source contribute to the O3 at a given location and time? While many techniques, ranging from simple brute-force methods to formal sensitivity analysis, have been used, the "backward" attribution of O3 at a given location/time to its sources remains challenging due to the nonlinearities in the system.
How does one compute the contribution (in terms of magnitude) of one source, from a multitude of ground-level and aloft emission sources, to the simulated concentrations of ozone (and related pollutants) at given grid locations and times, in the presence of complex nonlinear chemistry-meteorology interactions within a regional-scale numerical air quality modeling system? The toy calculation below illustrates the core difficulty.
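As a toy illustration of why brute-force ("zero-out") attribution runs into trouble, the sketch below uses an invented nonlinear O3 production expression, not a real chemical mechanism; the sources and parameters are made up. Because the chemistry is nonlinear, the per-source contributions obtained by removing one source at a time do not sum to the total.

    def ozone(nox, voc, k=1.0, a=0.4):
        """Toy nonlinear O3 production: roughly linear in NOx at low levels,
        with a titration-like suppression at high NOx. Invented for illustration."""
        return k * voc * nox / (1.0 + a * nox**2)

    # Hypothetical (NOx, VOC) emissions from three sources reaching one grid cell.
    sources = [(0.8, 0.5), (1.2, 0.9), (0.5, 1.4)]
    nox_tot = sum(s[0] for s in sources)
    voc_tot = sum(s[1] for s in sources)
    o3_total = ozone(nox_tot, voc_tot)

    # Brute-force "zero-out" attribution: remove one source at a time and
    # credit it with the resulting change in O3.
    contributions = []
    for i in range(len(sources)):
        nox = sum(s[0] for j, s in enumerate(sources) if j != i)
        voc = sum(s[1] for j, s in enumerate(sources) if j != i)
        contributions.append(o3_total - ozone(nox, voc))

    print("total O3:", o3_total)
    print("zero-out contributions:", contributions)
    print("sum of contributions:", sum(contributions))  # != total O3, in general

The same non-additivity appears in a full air quality model, which is why formal sensitivity and adjoint techniques are considered alongside brute force.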
Problem 3. Next generation thermal management of buildings
This project will explore next-generation, environmentally friendly cooling and heating of buildings. With respect to cooling: since the average daily temperature in much of California is below 72 degrees Fahrenheit, we could capture and store coolness during the night hours for use during the hot hours, so as to eliminate or reduce compressor-based cooling. Air conditioning drives peak electrical loads on the grid, which are very expensive because power plants are built just to serve those peak times, so there is a very strong incentive to reduce these loads. Traditionally this was done with heavy stone buildings providing large thermal mass; more recently, concrete buildings can serve the same role, but they are not common because of their expense. The workshop will explore the use of thermal energy storage tanks to store night-time cooling obtained via evaporation with a cooling tower (requiring 90% less energy than compressor-based air conditioning); a back-of-the-envelope sizing calculation is sketched below. For next-generation heating, we will analyze the German Passiv Haus concept, which uses very good air-tightness and carefully sized insulation to minimize thermal transfer through the building envelope, and then relies mostly on internal heat gains (people, lights, computers, cooking) and some external heat gains (sun through windows) to keep the building warm. The latter component of the project involves data mining from very detailed Excel-based energy models from about 10,000 projects.
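As a rough sketch of the storage-tank sizing involved (all numbers assumed for illustration, not taken from the project), the calculation below converts a daytime cooling load into the chilled-water tank volume needed to carry it, using the heat balance Q = m c dT:

    RHO_WATER = 1000.0   # kg/m^3, density of water
    C_WATER = 4186.0     # J/(kg K), specific heat of water

    def tank_volume_m3(cooling_load_kw, hours, delta_t_k):
        """Tank volume needed to store `cooling_load_kw` of cooling for
        `hours` hours, cycled over a temperature swing of `delta_t_k`."""
        energy_j = cooling_load_kw * 1000.0 * hours * 3600.0
        mass_kg = energy_j / (C_WATER * delta_t_k)
        return mass_kg / RHO_WATER

    # Hypothetical office building: 100 kW of cooling for 6 hot hours,
    # with the tank cycled between 7 C and 15 C (an 8 K swing).
    print(f"{tank_volume_m3(100.0, 6.0, 8.0):.1f} m^3")  # ~64.5 m^3

Under these assumed numbers the tank is a few tens of cubic meters for a modest building; the interesting modeling questions concern charging dynamics, thermal losses, stratification, and control.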
Problem 4. Developing algorithms to distinguish and classify locations based on their weather sensitivity
In today's electric system, with aging infrastructure, evolving technologies, and increasing demand, industry is striving to optimize future grid expansion efforts. The strategic objective of power system planning is the development of a long-term system infrastructure strategy that minimizes total societal cost (i.e., avoids unnecessary ratepayer cost, undesirable environmental impact, etc.) while still maintaining an acceptable level of electric system reliability and safety. A critical component of power system expansion planning is the development of future demand forecasts. If a forecasting methodology is too conservative, it can result in unnecessary societal cost and environmental impact (i.e., more infrastructure than needed). If a forecasting approach is too optimistic, it can result in safety problems (failing or inadequate infrastructure, risk to the public or to utility workers, etc.).
Weather conditions have a significant impact on system demand. Since analysis of historical demand data is an element of future demand forecasting, but historical demand is affected by historical weather conditions, it is important to be able to normalize historical demand data to a common set of weather conditions. This normalization can be difficult if different areas of the same electrical planning area have significantly different degrees of correlation with weather conditions such as temperature. For example, some locations may be "winter-peaking" (peak demand at low temperature), some "summer-peaking" (peak demand at high temperature), and some "temperature-independent" (no correlation). Can algorithms be developed to examine a large volume of historical demand and weather data, distinguish and classify locations based on their weather sensitivity, and develop locationally appropriate, optimized predictive models for each type of sensitivity? A simple starting point is sketched below.
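One simple (assumed, not validated) starting point for the classification step: label each location by the sign and strength of the correlation between its historical demand and temperature. The threshold, synthetic data, and labels below are invented for demonstration.

    import numpy as np

    def classify(temps, demand, threshold=0.3):
        """Label a location by the sign/strength of its demand-temperature
        correlation; `threshold` is an assumed cutoff."""
        r = np.corrcoef(temps, demand)[0, 1]
        if r > threshold:
            return "summer-peaking", r   # demand rises with temperature
        if r < -threshold:
            return "winter-peaking", r   # demand rises as temperature falls
        return "temperature-independent", r

    # Synthetic hourly data for three hypothetical locations.
    rng = np.random.default_rng(0)
    temps = rng.uniform(-5, 35, 1000)                                  # degrees C
    cooling = 50 + 2.0 * np.maximum(temps - 18, 0) + rng.normal(0, 3, 1000)
    heating = 50 + 2.0 * np.maximum(18 - temps, 0) + rng.normal(0, 3, 1000)
    flat = 50 + rng.normal(0, 3, 1000)
    for name, demand in [("A", cooling), ("B", heating), ("C", flat)]:
        print(name, classify(temps, demand))

Real demand is of course driven by more than temperature (day of week, humidity, economic activity), so a production classifier would need richer features, piecewise or nonlinear response models, and careful validation against held-out data.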
The workshop will differ from typical conferences in some regards. Participants will be invited to suggest open problems and questions before the workshop begins, and these will be posted on the workshop website. These include specific problems on which there is hope of making some progress during the workshop, as well as more ambitious problems which may influence the future activity of the field. Lectures at the workshop will be focused on familiarizing the participants with the background material leading up to specific problems, and the schedule will include discussion and parallel working sessions.
The deadline to apply for support to participate in this workshop has passed.
For more information, email workshops@aimath.org.