Category Archives: Peer Reviewed Article Review

why our institutions are failing to deliver optimal green infrastructure

I think this article explains well why real green infrastructure is hard to achieve. Multiple government agencies are responsible for bits and pieces of it, and even if each is acting efficiently within its own limited mission, they are not coordinating to achieve goals efficiently as a whole. I see this plenty in my professional life dealing with water, parks, and transportation agencies.

Lost in Transactions: Analysing the Institutional Arrangements Underpinning Urban Green Infrastructure

Urban development has altered surface-water hydrology of landscapes and created urban heat island effects. With climate change, the increasing frequency of extreme heat events and, in some areas, episodic drought and flooding present new challenges for urban areas. Green infrastructure holds potential as a cost-effective means of providing microclimate cooling and stormwater diversion. Further, green open spaces when combined with the provision of equipment and facilities have the potential to promote physical and emotional well-being. However, successful implementation may be predicated on co-ordinated efforts of multiple agencies. The Institutional Analysis and Development framework developed by Crawford and Ostrom is used in a case study to understand the institutional impediments, transaction costs and gaps in responsibility associated with the delivery of green infrastructure. Lessons learned are potentially transferable to other urban settings. Our analysis reveals areas of high transaction costs as well as a gap in the polycentric decision-making of agencies. The local government council is concerned with the well-being of its residents but has limited financial capacity. None of the agencies who deliver green infrastructure have responsibility for facilitating the indirect or preventative health benefits. Thus, a co-ordination problem among agencies can lead to suboptimal investments in green infrastructure.


evidence-based restoration

If ecosystem restoration hasn’t been based on evidence in the past, what has it been based on?

Evidence-based restoration in the Anthropocene—from acting with purpose to acting for impact

The recognition that we are in the distinct new epoch of the Anthropocene suggests the necessity for ecological restoration to play a substantial role in repairing the Earth’s damaged ecosystems. Moreover, the precious yet limited resources devoted to restoration need to be used wisely. To do so, we call for the ecological restoration community to embrace the concept of evidence-based restoration. Evidence-based restoration involves the use of rigorous, repeatable, and transparent methods (i.e. systematic reviews) to identify and amass relevant knowledge sources, critically evaluate the science, and synthesize the credible science to yield robust policy and/or management advice needed to restore the Earth’s ecosystems. There are now several examples of restoration-relevant systematic reviews that have identified instances where restoration is entirely ineffective. Systematic reviews also serve as a tool to identify the knowledge gaps and the type of science needed (e.g. repeatable, appropriate replication, use of controls) to improve the evidence base. The restoration community, including both scientists and practitioners, needs to make evidence-based restoration a reality so that we can move from best intentions and acting with so-called “purpose” to acting for meaningful impact. Doing so has the potential to serve as a rallying point for reframing the Anthropocene as a so-called “good” epoch.

Polarization, Partisanship and Junk News Consumption over Social Media in the US

Maybe this is just the Brits picking on us. Or, maybe they are onto something.

Vidya Narayanan, Vlad Barash, John Kelly, Bence Kollanyi, Lisa-Maria Neudert, and Philip N. Howard. “Polarization, Partisanship and Junk News Consumption over Social Media in the US.” Data Memo 2018.1. Oxford, UK: Project on Computational Propaganda. comprop.oii.ox.ac.uk

What kinds of social media users read junk news? We examine the distribution of the most significant sources of junk news in the three months before President Donald Trump’s first State of the Union Address. Drawing on a list of sources that consistently publish political news and information that is extremist, sensationalist, conspiratorial, masked commentary, fake news and other forms of junk news, we find that the distribution of such content is unevenly spread across the ideological spectrum. We demonstrate that (1) on Twitter, a network of Trump supporters shares the widest range of known junk news sources and circulates more junk news than all the other groups put together; (2) on Facebook, extreme hard right pages—distinct from Republican pages—share the widest range of known junk news sources and circulate more junk news than all the other audiences put together; (3) on average, the audiences for junk news on Twitter share a wider range of known junk news sources than audiences on Facebook’s public pages.

I hadn’t heard the term computational propaganda before. Here is how they describe it:

The Computational Propaganda Research Project (COMPROP) investigates the interaction of algorithms, automation and politics. This work includes analysis of how tools like social media bots are used to manipulate public opinion by amplifying or repressing political content, disinformation, hate speech, and junk news.

We use perspectives from organizational sociology, human computer interaction, communication, information science, and political science to interpret and analyze the evidence we are gathering. Our project is based at the Oxford Internet Institute, University of Oxford.

So in other words, we are all being manipulated by some very old and tired ideas using powerful new technologies that Hitler and Stalin could only have dreamed of.

precision nutrition

Lancet has an article on precision nutrition and diabetes. Precision nutrition is the idea of a diet tailored specifically to an individual based on analysis of factors such as their genetics, proteins, and gut bacteria.

Precision nutrition for prevention and management of type 2 diabetes

Precision nutrition aims to prevent and manage chronic diseases by tailoring dietary interventions or recommendations to one or a combination of an individual’s genetic background, metabolic profile, and environmental exposures. Recent advances in genomics, metabolomics, and gut microbiome technologies have offered opportunities as well as challenges in the use of precision nutrition to prevent and manage type 2 diabetes. Nutrigenomics studies have identified genetic variants that influence intake and metabolism of specific nutrients and predict individuals’ variability in response to dietary interventions. Metabolomics has revealed metabolomic fingerprints of food and nutrient consumption and uncovered new metabolic pathways that are potentially modified by diet. Dietary interventions have been successful in altering abundance, composition, and activity of gut microbiota that are relevant for food metabolism and glycaemic control. In addition, mobile apps and wearable devices facilitate real-time assessment of dietary intake and provide feedback which can improve glycaemic control and diabetes management. By integrating these technologies with big data analytics, precision nutrition has the potential to provide personalised nutrition guidance for more effective prevention and management of type 2 diabetes. Despite these technological advances, much research is needed before precision nutrition can be widely used in clinical and public health settings. Currently, the field of precision nutrition faces challenges including a lack of robust and reproducible results, the high cost of omics technologies, and methodological issues in study design as well as high-dimensional data analyses and interpretation. Evidence is needed to support the efficacy, cost-effectiveness, and additional benefits of precision nutrition beyond traditional nutrition intervention approaches. Therefore, we should manage unrealistically high expectations and balance the emerging field of precision nutrition with public health nutrition strategies to improve diet quality and prevent type 2 diabetes and its complications.

I don’t want to be cynical, but I can imagine a scenario where this technology really catches on yet is accessible only to the rich. The result would be the rich living much longer than the rest of us (and they already live longer).

quantifying ecological functions

Here is an interesting article on quantifying ecological functions. The main application appears to be wetland mitigation but the theory seems more general and could maybe be adapted to a variety of ecosystem restorations or creations.

Landscape consequences of aggregation rules for functional equivalence in compensatory mitigation programs

Mitigation and offset programs designed to compensate for ecosystem function losses due to development must balance losses from affected ecosystems and gains in restored ecosystems. Aggregation rules applied to ecosystem functions to assess site equivalence are based on implicit assumptions about the substitutability of functions among sites and can profoundly influence the distribution of restored ecosystem functions on the landscape. We investigated the consequences of rules applied to aggregation of ecosystem functions for wetland offsets in the Beaverhill watershed in Alberta, Canada. We considered the fate of 3 ecosystem functions: hydrology, water purification, and biodiversity. We set up an affect-and-offset algorithm to simulate the effect of aggregation rules on ecosystem function for wetland offsets. Cobenefits and trade-offs among functions and the constraints posed by the quantity and quality of restorable sites resulted in a redistribution of functions between affected and offset wetlands. Hydrology and water-purification functions were positively correlated and negatively correlated with biodiversity function. Weighted-average rules did not replace functions in proportion to their weights. Rules prioritizing biodiversity function led to more monofunctional wetlands and landscapes. The minimum rule, for which the wetland score was equal to the worst performing function, promoted multifunctional wetlands and landscapes. The maximum rule, for which the wetland score was equal to the best performing function, promoted monofunctional wetlands and multifunctional landscapes. Because of implicit trade-offs among ecosystem functions, no-net-loss objectives for multiple functions should be constructed within a landscape context. Based on our results, we suggest criteria for the design of aggregation rules for no net loss of ecosystem functions within a landscape context include the concepts of substitutability, cobenefits and trade-offs, landscape constraints, heterogeneity, and the precautionary principle.
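The weighted-average, minimum, and maximum rules are easier to see with a toy example. Here is a quick sketch (my own illustration with made-up scores, not the authors’ model) of how a single wetland’s composite score differs under each rule:

```python
# Toy illustration of the aggregation rules described in the abstract.
# Function scores are hypothetical values in [0, 1]; I made them up.

def weighted_average(scores, weights):
    """Composite score as a weighted average of function scores."""
    total = sum(weights.values())
    return sum(scores[f] * w for f, w in weights.items()) / total

def minimum_rule(scores):
    """Wetland score equals its worst-performing function."""
    return min(scores.values())

def maximum_rule(scores):
    """Wetland score equals its best-performing function."""
    return max(scores.values())

# A hypothetical wetland: strong on hydrology, weak on biodiversity.
wetland = {"hydrology": 0.9, "water_purification": 0.7, "biodiversity": 0.2}
weights = {"hydrology": 1.0, "water_purification": 1.0, "biodiversity": 1.0}

print(weighted_average(wetland, weights))  # ≈ 0.6 — trade-offs hidden in the average
print(minimum_rule(wetland))               # 0.2 — only improving the weakest function raises the score
print(maximum_rule(wetland))               # 0.9 — one strong function carries the whole site
```

You can see why the minimum rule pushes toward multifunctional wetlands (a site can’t score well unless every function does) while the maximum rule lets a site specialize in whatever it does best.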

more on movement ecology

I’m still digging into movement ecology, which has always fascinated me. Here is a comprehensive recent literature review on the subject.

Trends and missing parts in the study of movement ecology

Movement is important to all organisms, and accordingly it is addressed in a huge number of papers in the literature. Of nearly 26,000 papers referring to movement, an estimated 34% focused on movement by measuring it or testing hypotheses about it. This enormous amount of information is difficult to review and highlights the need to assess the collective completeness of movement studies and identify gaps. We surveyed 1,000 randomly selected papers from 496 journals and compared the facets of movement studied with a suggested framework for movement ecology, consisting of internal state (motivation, physiology), motion and navigation capacities, and external factors (both the physical environment and living organisms), and links among these components. Most studies simply measured and described the movement of organisms without reference to ecological or internal factors, and the most frequently studied part of the framework was the link between external factors and motion capacity. Few studies looked at the effects on movement of navigation capacity, or internal state, and those were mainly from vertebrates. For invertebrates and plants most studies were at the population level, whereas more vertebrate studies were conducted at the individual level. Consideration of only population-level averages promulgates neglect of between-individual variation in movement, potentially hindering the study of factors controlling movement. Terminology was found to be inconsistent among taxa and subdisciplines. The gaps identified in coverage of movement studies highlight research areas that should be addressed to fully understand the ecology of movement.

An idea that has always fascinated me is that when designing a development, or even an entire urban area, you could actually lead with ecology, then layer hydrology, infrastructure, housing, and the other human elements on top of that. Sadly, I don’t think I know a single engineer or urban planner who would be particularly open-minded to this idea.

wildlife range in urban areas

Here’s an interesting study finding a general rule across many types of wildlife that their range after urbanization decreases to between one-half and one-third of what it was before urbanization.

Moving in the Anthropocene: Global reductions in terrestrial mammalian movements

Animal movement is fundamental for ecosystem functioning and species survival, yet the effects of the anthropogenic footprint on animal movements have not been estimated across species. Using a unique GPS-tracking database of 803 individuals across 57 species, we found that movements of mammals in areas with a comparatively high human footprint were on average one-half to one-third the extent of their movements in areas with a low human footprint. We attribute this reduction to behavioral changes of individual animals and to the exclusion of species with long-range movements from areas with higher human impact. Global loss of vagility alters a key ecological trait of animals that affects not only population persistence but also ecosystem processes such as predator-prey interactions, nutrient cycling, and disease transmission.

One type of animal included in the study was deer in Pennsylvania. I also learned the name of the academic discipline that studies animal ranges and movements: movement ecology.

living near a forest is good for your amygdala

The amygdala is a part of your brain, and what is good for it is good for you.

“Our results reveal a significant positive association between the coverage of forest and amygdala integrity,” the researchers report. The amygdala is the almond-shaped set of neurons that plays a key role in the processing of emotions, including fear and anxiety.

Perhaps surprisingly, Kuehn and her colleagues found no such association from living close to urban green spaces such as parks, or near bodies of water. Only proximity to forest land had this apparent positive effect…

The study complements the already-strong psychological evidence of the benefits of living close to nature. Previous research has linked access to green space to longer lives, lower levels of aggression, and kids’ cognitive development. One study suggests it even makes for nicer people.

drought and snowpack

At the same time that we are experiencing drought and groundwater depletion in populous, food-growing regions, there is concern about long-term declines in snowpack. Here are a few papers on the situation – two about the western United States and one about Asia.

Large near-term projected snowpack loss over the western United States

Peak runoff in streams and rivers of the western United States is strongly influenced by melting of accumulated mountain snowpack. A significant decline in this resource has a direct connection to streamflow, with substantial economic and societal impacts. Observations and reanalyses indicate that between the 1980s and 2000s, there was a 10–20% loss in the annual maximum amount of water contained in the region’s snowpack. Here we show that this loss is consistent with results from a large ensemble of climate simulations forced with natural and anthropogenic changes, but is inconsistent with simulations forced by natural changes alone. A further loss of up to 60% is projected within the next 30 years. Uncertainties in loss estimates depend on the size and the rate of response to continued anthropogenic forcing and the magnitude and phasing of internal decadal variability. The projected losses have serious implications for the hydropower, municipal and agricultural sectors in the region.

The twenty-first century Colorado River hot drought and implications for the future

Between 2000 and 2014, annual Colorado River flows averaged 19% below the 1906–1999 average, the worst 15-year drought on record. At least one-sixth to one-half (average at one-third) of this loss is due to unprecedented temperatures (0.9°C above the 1906–1999 average), confirming model-based analysis that continued warming will likely further reduce flows. Whereas it is virtually certain that warming will continue with additional emissions of greenhouse gases to the atmosphere, there has been no observed trend toward greater precipitation in the Colorado Basin, nor are climate models in agreement that there should be a trend. Moreover, there is a significant risk of decadal and multidecadal drought in the coming century, indicating that any increase in mean precipitation will likely be offset during periods of prolonged drought. Recently published estimates of Colorado River flow sensitivity to temperature combined with a large number of recent climate model-based temperature projections indicate that continued business-as-usual warming will drive temperature-induced declines in river flow, conservatively −20% by midcentury and −35% by end-century, with support for losses exceeding −30% at midcentury and −55% at end-century. Precipitation increases may moderate these declines somewhat, but to date no such increases are evident and there is no model agreement on future precipitation changes. These results, combined with the increasing likelihood of prolonged drought in the river basin, suggest that future climate change impacts on the Colorado River flows will be much more serious than currently assumed, especially if substantial reductions in greenhouse gas emissions do not occur.
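Putting the abstract’s numbers together (my own back-of-envelope, not the authors’ calculation), the attribution implies a rough sensitivity of river flow to temperature:

```python
# Rough back-of-envelope using only figures quoted in the abstract above.
# These are my own calculations, not the authors' published sensitivity.

flow_loss = 0.19        # 2000-2014 flows averaged 19% below the 1906-1999 mean
temp_share = 1 / 3      # average share of the loss attributed to warming
warming = 0.9           # degrees C above the 1906-1999 average

temp_induced_loss = flow_loss * temp_share   # fraction of mean flow lost to heat
sensitivity = temp_induced_loss / warming    # implied flow loss per degree C

print(f"Temperature-induced loss: {temp_induced_loss:.1%} of mean flow")
print(f"Implied sensitivity: ~{sensitivity:.1%} of flow per °C of warming")
```

That rough rate of roughly 7% of flow per degree, multiplied by a few degrees of continued warming, lands in the same ballpark as the −20% to −35% declines the authors project.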

Changes in seasonal snow water equivalent distribution in High Mountain Asia (1987 to 2009)

Snow meltwaters account for most of the yearly water budgets of many catchments in High Mountain Asia (HMA). We examine trends in snow water equivalent (SWE) using passive microwave data (1987 to 2009). We find an overall decrease in SWE in HMA, despite regions of increased SWE in the Pamir, Kunlun Shan, Eastern Himalaya, and Eastern Tien Shan. Although the average decline in annual SWE across HMA (contributing area, 2641 × 10³ km²) is low (average, −0.3%), annual SWE losses conceal distinct seasonal and spatial heterogeneities across the study region. For example, the Tien Shan has seen both strong increases in winter SWE and sharp declines in spring and summer SWE. In the majority of catchments, the most negative SWE trends are found in mid-elevation zones, which often correspond to the regions of highest snow-water storage and are somewhat distinct from glaciated areas. Negative changes in SWE storage in these mid-elevation zones have strong implications for downstream water availability.