Oct 8, 2015


SecurityIntelligence - Nuclear facilities are now a critical facet of the American utility market. According to the Nuclear Energy Institute (NEI), these facilities generated almost 20 percent of U.S. electricity through 2014, with 99 reactors in use across the country. Owing to the volatile nature of the materials and processes in any nuclear plant, companies have developed excellent physical safety and security procedures.

But as noted by a new Chatham House report based on interviews with 30 industry experts, cybersecurity risk is underestimated at nuclear facilities, leaving these critical producers open to cyberattacks. If plants are breached, what's the fallout?

The Myth of the Air Gap

A recent SecurityWeek article discusses some of the reasons for this lack of nuclear cyber readiness. The biggest problem? A belief that air-gapped systems effectively curtail the risk of cyberattack. The idea here is that since potentially vulnerable points such as industrial control system (ICS) software are often isolated from the Internet, there's little chance they could be compromised by attackers. The Chatham report, however, found that many nuclear facilities use technology such as virtual private networks (VPNs) to forge outside-facing connections — and in some cases, plant operators aren't even aware they exist.

Risk assessment is also problematic. The nuclear industry doesn't have a set of unified guidelines for measuring such risk, and the infrequency of cyber incident disclosures often provides a false sense of security. And according to the report, "very few" plants actually take steps — such as installing software patches, for example — to mitigate security risks. What's more, most are reliant on perimeter defenses alone to stop attackers, which hasn't been successful for retail, finance, manufacturing or other energy companies.

Culture also plays a role. As noted by Dark Reading, nuclear operations and IT staff don't always get along. Operations employees are focused on preventing accidental nuclear incidents, while IT professionals focus on curtailing intentional damage. Add in a different set of employment standards for regular staff and IT teams, and it's not surprising that cybersecurity isn't making headway.

In sum: Nuclear facilities are protected only by myth and miscommunication. And that's a problem.

Please read on at:


Ending Extreme Poverty and current count of the poor

Globally, only about 10 percent of the population lives in extreme poverty; in Africa the number is down to 35 percent. "Technology has absolutely had a leading role," economist Jeffrey Sachs said. "Nothing has been as important as the mobile phone."

The World Bank uses an updated international poverty line of US $1.90 a day, which incorporates new information on differences in the cost of living across countries (the PPP exchange rates). The new line preserves the real purchasing power of the previous line (of $1.25 a day in 2005 prices) in the world's poorest countries. Using this new line (as well as new country-level data on living standards), the World Bank projects that global poverty will have fallen from 902 million people, or 12.8 percent of the global population, in 2012 to 702 million people, or 9.6 percent of the global population, this year.
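As a rough sanity check on the figures quoted above (this calculation is not from the World Bank report itself): each headcount/rate pair implies a global population estimate, since headcount = poverty rate × world population. The implied figures of roughly 7.0 and 7.3 billion line up with actual world population in those years.

```python
# Consistency check: headcount = poverty rate x world population,
# so each (headcount, rate) pair implies a global population estimate.
implied_pop_2012 = 902e6 / 0.128   # 902 million at 12.8 percent (2012)
implied_pop_2015 = 702e6 / 0.096   # 702 million at 9.6 percent (2015 projection)

print(round(implied_pop_2012 / 1e9, 2))  # ~7.05 billion
print(round(implied_pop_2015 / 1e9, 2))  # ~7.31 billion
```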

Actual poverty data from low income countries come with a considerable lag but the organization, which released the information on the eve of its Annual Meetings in Lima, Peru, based its current projections on the latest available data.

Read more » at Next Big Future

Record 94,610,000 Americans Not in Labor Force; Participation Rate Lowest in 38 Years

(CNSNews.com) - A record 94,610,000 Americans were not in the American labor force last month -- an increase of 579,000 from August -- and the labor force participation rate reached its lowest point in 38 years, with 62.4 percent of the civilian noninstitutional population either holding a job or actively seeking one.

In other disappointing news, the economy added only 142,000 jobs in September, well below economists' expectations, but the unemployment rate remained at 5.1 percent, where it was in August.

The number of Americans not in the labor force has continued to rise, partly because of retiring baby-boomers and fewer workers entering the workforce.


In September, according to the Labor Department's Bureau of Labor Statistics, the nation's civilian noninstitutional population, consisting of all people 16 or older who were not in the military or an institution, reached 251,325,000. Of those, 156,715,000 participated in the labor force by either holding a job or actively seeking one.

The 156,715,000 who participated in the labor force equaled only 62.4 percent of the 251,325,000 civilian noninstitutional population. The last time the labor force participation rate was as low as 62.4 percent was in October 1977. (The rate had been 62.6 percent for the three months prior to September.)
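The arithmetic behind these two headline numbers is straightforward; a minimal sketch using only the BLS figures quoted above:

```python
# BLS figures quoted above for September 2015.
civilian_noninstitutional = 251_325_000  # all civilians 16+, not institutionalized
in_labor_force = 156_715_000             # holding a job or actively seeking one

# Not in labor force = population minus labor force.
not_in_labor_force = civilian_noninstitutional - in_labor_force

# Participation rate = labor force as a share of the population.
participation_rate = 100 * in_labor_force / civilian_noninstitutional

print(not_in_labor_force)            # 94610000 -- the record figure in the headline
print(round(participation_rate, 1))  # 62.4
```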

The participation rate dropped for men 20 years and older (the 71.3 percent rate in September is a record low in BLS data going back to 1948) and for women 16 years and older (56.4 percent in September, compared with 56.7 percent in the two preceding months).

Last month, 56,647,000 women 16 and older were not in the labor force, an increase of 394,000 from August and up 1,066,000 from September 2014.

That number also rose for men: In September, 32,387,000 men age 20 and older were not in the labor force, up 202,000 from August and an increase of 804,000 from September 2014.

Water content of foods and things, #USGS Water #Science #School

What is the water content of things?

Water is needed to not only grow everything we eat but also to produce almost all the products we use every day. This water is supplied by nature as precipitation or added by people during the growing and production process. You can't tell by the size of a product or the appearance of a food how much water was actually used to produce the item.

Use the form below to enter your guess about how much water is used to produce some common foods and products. Please realize this exercise is meant to give you an estimate of how much water is needed to produce these items. It is very difficult to come up with accurate water-use numbers, and the large variety of food-growing and production techniques used worldwide means that the amount of water needed can vary a huge amount, depending on how and where the food is grown.

Yet another consideration is how far back to go in the chain of production to estimate water use. For beef, some estimates only consider drinking water for cattle, whereas other sources may consider the water needed to grow the food that the cow eats.

The data here were taken from two sources:

Oct 7, 2015

The Decline of ‘Big Soda’ - represents the single largest change in the American diet

Over the last 20 years, sales of full-calorie soda in the United States have plummeted by more than 25 percent. Soda consumption, which rocketed from the 1960s through 1990s, is now experiencing a serious and sustained decline.

Sales are stagnating as a growing number of Americans say they are actively trying to avoid the drinks that have been a mainstay of American culture. Sales of bottled water have shot up, and bottled water is now on track to overtake soda as the largest beverage category in two years, according to at least one industry projection.

The drop in soda consumption represents the single largest change in the American diet in the last decade and is responsible for a substantial reduction in the number of daily calories consumed by the average American child. From 2004 to 2012, children consumed 79 fewer sugar-sweetened beverage calories a day, according to a large government survey, representing a 4 percent cut in calories over all. As total calorie intake has declined, obesity rates among school-age children appear to have leveled off. 

Oct 6, 2015

Oil and gas industry getting hidden subsidies, study says; industry official says data misinterpreted

A study by the environmental group Friends of the Earth that focused on the oil and gas industry in North Dakota found that the "royalty-free flaring of natural gas from wells on public and tribal lands amounts to a hidden federal subsidy worth tens of millions of dollars," Phil McKenna reports for InsideClimate News. "But one of the biggest producers of oil in the state, Continental Resources, Inc., challenged the findings, suggesting that the research overstated the volumes of hydrocarbons being burned at wells." Jeff Hume, vice chairman of strategic growth initiatives, told McKenna, "They have obtained flare volume reports which are accurate, [but] what they don't realize is the majority of gas that is reported as flared is inert gas, not hydrocarbons."

The study found that "over a six-year period, the U.S. Bureau of Land Management subsidized the burning of $524 million of natural gas by oil and gas companies operating on public and tribal lands in North Dakota," McKenna writes. "Federal regulations allow oil companies to flare gas without paying royalties if it is the only way they can economically extract oil from a well, Ross said. The companies in the North Dakota study flared 107 billion cubic feet of natural gas from 2007 to 2013, the study found. The carbon dioxide emissions from this were equal to the annual output of more than 1.3 million cars, according to the report. This royalty-free flaring resulted in a $66 million subsidy over the six years of the study for oil and gas companies in North Dakota, the report found."

Oklahoma City-based Continental Resources officials countered that "the study overstated the company's share of flared methane or other hydrocarbons," McKenna writes. Of the 55 billion cubic feet of gas that Friends of the Earth reported as hydrocarbons flared by Continental Resources in North Dakota, Hume said "more than 53.4 of it, or more than 97 percent, was carbon dioxide or nitrogen from enhanced oil recovery operations outside the Bakken formation in Bowman and Slope counties." (Read more)

Anonymous insiders reveal real hacking risks to nuclear power plants, report

Computerworld - The risk of serious cyber-attacks on nuclear power plants is growing, according to a new report by think-tank Chatham House. If you follow this type of news, then this is probably not a big shocker, but did you know there have been around 50 cyberattacks on nuclear plants?

One unnamed expert quoted in the Chatham report (pdf) claimed, "What people keep saying is 'wait until something big happens, then we'll take it seriously'. But the problem is that we have already had a lot of very big things happen. There have probably been about 50 actual control systems cyber incidents in the nuclear industry so far, but only two or three have been made public." The report claimed that there is limited incident disclosure and a "need to know" mindset that further limits collaboration and information-sharing.  

To read this article in full or to leave a comment, please click here

Towards pills that can mimic many of the benefits of exercise

Everyone knows that exercise improves health, and ongoing research continues to uncover increasingly detailed information on its benefits for metabolism, circulation, and improved functioning of organs such as the heart, brain, and liver. With this knowledge in hand, scientists may be better equipped to develop "exercise pills" that could mimic at least some of the beneficial effects of physical exercise on the body. But a review of current development efforts, published October 2 in Trends in Pharmacological Sciences, ponders whether such pills will achieve their potential therapeutic impact, at least in the near future.

Several laboratories are developing exercise pills, which at this early stage are being tested in animals to primarily target skeletal muscle performance and improve strength and energy use--essentially producing stronger and faster muscles. But of course the benefits of exercise are far greater than its effects on only muscles.

"Clearly people derive many other rewarding experiences from exercise--such as increased cognitive function, bone strength, and improved cardiovascular function," says Laher. "It is unrealistic to expect that exercise pills will fully be able to substitute for physical exercise--at least not in the immediate future."

Figure from an earlier 2013 look at exercise polypill

Physiology online - Exercise is the Real Polypill (2013)

Trends in Pharmacological Sciences - Exercise Pills: At the Starting Line (2015)

Read more » at  Next Big Future

Study Finds Humans Are Worse Than Radiation For Chernobyl Animals via @Slashdot

A study published today in Current Biology shows that wildlife in the Chernobyl exclusion zone is actually more abundant than it was before the disaster. According to the authors, led by Portsmouth University professor of environmental science Jim Smith, the recovery is due to the removal of the single biggest pressure on wildlife—humans. "The wildlife at Chernobyl is very likely better than it was before the accident, not because radiation is good for animals, but because human occupation is much worse," Smith says. "We were trying to emphasize that this study is a remarkable illustration of an obvious, but important message," he said. "It is ordinary human habitation and use (farming, forestry, hunting) of land which does most ecological damage."

Read more of this story at Slashdot.

Quantum computing: First two-qubit logic gate in silicon | #tech #Engineering

Andrew Dzurak and his team have built a quantum logic gate in silicon for the first time.

An Australian team of engineers has built a quantum logic gate in silicon for the first time, making calculations between two qubits of information possible – and thereby clearing the final hurdle to making silicon quantum computers a reality. Their work was published online in the international scientific journal, Nature, on 5 October 2015 (London time). 

It's the first time calculations between silicon quantum bits have been demonstrated. To achieve this, the University of New South Wales (UNSW) team constructed a device, known as a 'quantum logic gate', that allows calculations to be performed between two quantum bits, or 'qubits'. The advance completes the physical components needed to realise super powerful silicon quantum computers.

Lead author Menno Veldhorst (left) and project leader Andrew Dzurak (right) in the UNSW laboratory where the experiments were performed. Credit: Paul Henderson-Kelly/UNSW.


Any conceivable application, or software program, that would run on a quantum computer is made up of a series of basic one-qubit and two-qubit calculations.

Until now it had not been possible to make two silicon quantum bits "talk" to each other, to perform such "two-qubit" calculations, or "logic gates". The UNSW result means that all of the physical building blocks have now been constructed, and so computer engineers can finally begin the task of building a functioning quantum computer in silicon.
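By way of illustration (this is textbook quantum computing, not the UNSW silicon device itself), a two-qubit logic gate is a 4x4 unitary matrix acting on the joint two-qubit state. A minimal sketch of the canonical example, the CNOT gate, which flips the target qubit whenever the control qubit is 1:

```python
import numpy as np

# CNOT in matrix form, with basis states ordered |00>, |01>, |10>, |11>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start in |10>: control qubit is 1, target qubit is 0.
state = np.array([0, 0, 1, 0])

# Applying the gate flips the target: |10> -> |11>.
result = CNOT @ state
print(result)  # [0 0 0 1]
```

Chaining one-qubit rotations with two-qubit gates like this one is what the article means by "any conceivable application" being built from basic one- and two-qubit calculations.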

Industrial manufacture now possible

A key advantage of the UNSW approach is that they have reconfigured the 'transistors' that are used to define the bits in existing silicon chips, and turned them into qubits.

"Because we use essentially the same device technology as existing computer chips, we believe it will be much easier to manufacture a full-scale processor chip than for any of the leading designs, which rely on more exotic technologies," says Professor Dzurak.

"This makes the building of a quantum computer much more feasible, since it is based on the same manufacturing technology as today's computer industry," he adds.

Dzurak noted that the team had recently "patented a design for a full-scale quantum computer chip that would allow for millions of our qubits, all doing the types of calculations that we've just experimentally demonstrated."

He said that a key next step for the project is to identify the right industry partners to work with to manufacture the full-scale quantum processor chip.

Please continue reading from: Engineering

Oct 5, 2015

Economics, well this isn't good

Items from Economics news:

ConAgra Foods Lays Off 1,500 (Salt Lake Tribune)

Chesapeake Energy Lays Off 740 (KFOR News Channel 4)

Whole Foods Lays Off 1,500 (CNBC)

20 Largest U.S. Layoffs in 2015 (ZeroHedge)

Are Apple and Facebook bad for democracy?

Computerworld - We're trying to have a democracy here, and ideally an informed one.

Nowadays, however, almost everyone is too distracted with their smartphones to muster the attention span to put up with reading a newspaper or news magazine delivered by a publisher, or even watching TV news.

Instead, we get news through apps and on social networks. The biggest source of apps in the U.S. and the biggest social network are Apple's App Store and Facebook, respectively.

This trend transfers the job of gatekeeper of what political information reaches the public from publications, editors or news directors to the likes of Apple and Facebook -- the companies that choose, in Apple's case, which apps are allowed and which are banned or, in Facebook's case, which news stories or sources are favored by its secret algorithms.

To read this article in full or to leave a comment, please click here

Oct 4, 2015

EPA Lowers the Ozone Ambient Air Standard

On October 1, 2015, the United States Environmental Protection Agency (EPA) lowered the existing ozone National Ambient Air Quality Standard (NAAQS) to 70 parts per billion (ppb) from its previous level of 75 ppb. The rule can be found here. This revision, if upheld by the courts, will result in more Wisconsin counties being classified as nonattainment, which will, in turn, result in higher regulatory burdens for expanding economic activity within those areas. EPA estimates the cost of the rule at $1.4 billion, with public health benefits between $2.9 billion and $5.9 billion; industry critics, however, dispute these figures.

By way of background, the Clean Air Act required EPA to set NAAQS for ozone and five other pollutants. NAAQS are set at levels which are deemed protective of human health and the environment with an adequate margin of safety. The Clean Air Act requires EPA to review these standards every five years and update the standards if necessary.
The last ozone NAAQS review process was completed in 2008 and resulted in EPA setting the ozone standard at 75 ppb. On January 19, 2010, EPA proposed lowering the ozone NAAQS to a level between 60 and 70 ppb, with former EPA Administrator Jackson recommending 65 ppb. However, on September 2, 2011, President Obama directed EPA to withdraw the proposal. This prompted a lawsuit wherein a federal court judge ordered EPA to finalize its review of the ozone NAAQS by October 1, 2015.

In November 2014, EPA proposed revising the ozone standard within a range of 65 to 70 ppb, but sought comment on lowering the standard to 60 ppb. This proposal was quite controversial since the lower end of the range approaches natural background ozone levels in portions of the western United States.

On October 1, 2015, EPA finalized the new standard at 70 ppb. EPA justified departing from Administrator Jackson's 2010 recommendation of 65 ppb by asserting that EPA now has more data that was unavailable in the past.

Now that EPA has lowered the standard, States must begin the implementation process. EPA issued a memorandum dated October 1, 2015 which provides states and EPA Regional offices with guidance on how to implement the new standard. Within the memo, Acting EPA Air Chief Janet McCabe asserts that EPA will work with states "to carry out the duties of ozone air quality management in a manner that maximizes common sense, flexibility and cost-effectiveness while achieving improved public health expeditiously and abiding by the legal requirements [of the Clean Air Act]."

This memorandum commits EPA to issue new designation guidance in early 2016. Nonetheless, nonattainment designations will likely be due within two years (Fall 2017) and therefore will be based on ozone data collected in calendar years 2014 through 2016. Infrastructure state implementation plans (SIPs) will be due within three years (Fall 2018) and attainment plans within 5 years (Fall 2020). Attainment will be required by 2020 and 2023 for marginal and moderate nonattainment areas, respectively.

Want to know effects on Wisconsin?
Read More from Todd E. Palmer of www.michaelbest.com

Are bladeless turbines the future of wind energy? |

MNN - Mother Nature Network...The turbine consists of a fiberglass and carbon fiber cone that vibrates when wind hits it. At the base are rings of repelling magnets that pull in the direction opposite to the wind's push. Electricity is then produced via an alternator that harnesses the kinetic energy of the vibrations.

Lower output, but lower costs

Overall, its makers say the Vortex will produce about 30 percent less energy than a conventional turbine. But because twice as many can fit in any given area, and because the costs are about half those of a traditional turbine, it's hoped that the overall impact will be a net positive in terms of ROI. That's before taking into account benefits like a lower cost of capital, which makes the technology more accessible for individual installations, or the fact that bird and bat deaths would no longer need to be considered when siting such turbines.

As with any new technology, however, it's important not to get too carried away before full-scale field trials prove that the concept is technically and commercially viable. Already, some experts are questioning the assumptions behind The Vortex. In MIT Technology Review's coverage of the company, several wind energy researchers suggested that large-scale applications may run into challenges.

Questions remain

In the aforementioned article, Sheila Widnall, an aeronautics and astronautics professor at MIT, suggested that there's a fundamental qualitative difference between the vorticity produced at small scales and low wind speeds and the way wind would behave at higher speeds with larger turbines:

"With very thin cylinders and very slow velocities you get singing telephone lines, an absolutely pure frequency or tone. [...] But when the cylinder gets very big and wind gets very high, you get a range of frequencies. You won't be able to get as much energy out of it as you want to because the oscillation is fundamentally turbulent."

She also questioned whether the "silent" operation promised by the company would actually turn out to be a reality. The oscillating wind itself would create significant noise in a wind farm made of Vortex turbines; it would actually sound like a freight train, she suggested.

One of many potential innovations

The Vortex is just one of many different wind energy concepts that are in active development — and whether or not it comes to fruition remains to be seen. One thing is certain: While current wind turbine technology is already beating many experts' expectations in terms of how quickly it would scale up, we can safely assume that there is always room for improvement. The fact that engineers, inventors and entrepreneurs across the world are exploring different ways to harness the wind's energy should be an encouraging sign that renewable energy's already bright future is likely to only get brighter.

Please read full and follow at: 

Legionnaires' Bacteria Reemerges In Previously Disinfected Cooling Towers

The New York Times has an unsettling report that 15 water-cooling towers in the Bronx that tested positive this week for Legionnaires' disease had been disinfected less than two months ago. From the NYT: After an outbreak of the disease killed 12 people in July and August in the South Bronx, the city required every building with cooling towers, a common source of the Legionella bacteria that cause the disease, to be cleaned within two weeks. ... [The] city found this week that bacteria had regrown in at least 15 towers that had been cleaned recently in the Morris Park section of the Bronx. The testing occurred after a fresh outbreak in that area that has killed one person and sickened at least 12, and spurred an order from health officials for the towers to be disinfected again.

Read more of this story at Slashdot.

Oct 2, 2015

Why Canada is banning microbeads ... Spoiler: we only have one Earth

Public Radio International - Canada has taken a bold step toward banning microbeads, the tiny pieces of plastic found in dozens of beauty products widely sold in the US as well.

Neutrogena advertises "energizing microbeads" in their foaming scrub. (Photo: Steven Davy)

Toothpastes, facial scrubs, body lotions, shower gels: all filled with tiny pieces of polyethylene.

And the real problem created by these plastic beads is where they go after we rinse and spit. Microbeads end up in lakes, rivers and oceans — and that's got Canadian biologist Lisa Erdle worried.

Erdle works for Ontario Streams, a conservation group based in Toronto. She's spent the past few months collecting water samples from the surface of Lake Ontario.

Water samples from Lake Ontario, Canada. (Photo: Andrea Crossan)

"When we have the net in the water, we are sort of skimming the surface and then collect everything in this really fine mesh," explains Erdle. "[We] then rinse it down and then this gunk, some are a little messier than others is what is collected in the net."

Microbeads are small — like the size of a grain of sand, small. That's the problem. They are too small to be caught by the filters in wastewater plants.

"Most people who are using these face washes, body washes, hand soaps, cosmetics, have no idea that there is plastic in the products that they're using," says Sherri  Mason.

She studies microbeads at the State University of New York at Fredonia. The water samples that Lisa Erdle collects on Lake Ontario go to Mason.

"So [consumers] use these products, they wash their face, they go down the drain and they are not realising that they are actually releasing plastic into the environment."

Mason analyses how much plastic is making its way into the largest freshwater ecosystem on earth — the Great Lakes.

What she's found is the stuff of nightmares.

New York state alone dumps around 19 tons of microbeads down the drain every year. Once these beads enter the water, they attract toxic substances, like PCBs.

"The concern is that as [the beads] are ingested by organisms that live in the water, they desorb into that organism so you have things like PCBs or PAHs, triclosan," says Mason. "Some of these chemicals that are known to be endocrine disrupters, they are known to be carcinogens. They are known to have very significant human health impacts and basically the plastics act as a means to move those chemicals from the water and into the food web."

And they become part of the food chain.

Products that include microbeads. (Photo: Steven Davy)

"The smaller the plastic the bigger the impact, so it's the pieces that you can't see that are of most concern to scientists because they are more easily ingested, whether intentionally, like a fish seeing this round plastic might think it's a fish egg and eat it, or unintentionally. The beginning part of the food chain are filter feeders, like mussels. And literally all they do is filter water through their bodies so they are not choosing what they eat so anything that's in the water will end up in these organisms."

The fish eat the microbeads. We eat the fish.

And it's why the Canadian government announced in July that it intends to ban the use of microbeads in personal-care products.

The Canadian government reviewed more than 130 scientific papers and concluded that microbeads should be added to the national list of toxic substances.

"It's important to realize that water connects us all to each other," says Mason. "If it's in the water, it affects everybody whether you live in Bangladesh, Zimbabwe or upstate New York."

There are some states in the US that have passed bills banning microbeads. But the US bans don't go into effect until January 2018. And those bills only ban some microbeads, allowing the use of biodegradable ones.

Those are also plastic, though they can be broken down under laboratory conditions. It's not clear whether the beads would biodegrade at the bottom of a lake.

But there is some reason to be hopeful that microbeads will soon be a thing of the past. A number of large beauty product manufacturers are phasing out the use of microbeads.

Meaning that next summer, it's possible that Lisa Erdle won't be seeing as many of those tiny blue plastic balls on the surface of the water.

"I care about this because I grew up on the lake,' says Erdle. "I swim in Lake Ontario, I sail. I fish as well. So, I want a clean lake."

Do you want to know what products have microbeads in them? Here's a list of the products using beads

Please read full and follow at: 

Fukushima disaster still happening: the World’s Never Seen Anything Like This

Counterpunch - Fuel from the Fukushima Daiichi Nuclear Power Plant's No. 2 reactor is missing from the core containment vessel. Utilizing cosmic ray muon radiography with nuclear emulsion, researchers from Nagoya University peered inside the reactors at Fukushima. The nuclear fuel in reactor core No. 5 was clearly visible via the muon process. However, at the No. 2 reactor, which released a very large amount of radioactive substances coincident with the 2011 explosion, little, if any, sign of nuclear fuel appears in the containment vessel. A serious meltdown is underway.

EPA Acts To Mitigate 44 To 73 Percent Of Acute Pesticide Incidents Among Farmworkers | @ThinkProgress

ThinkProgress - The Environmental Protection Agency (EPA) took a major step this week to protect the thousands of U.S. agricultural workers who are exposed to pesticides every year, many of whom suffer from chronic health effects years after they stop working in the fields.

EPA officials announced that the agency will help mitigate pesticide exposure by updating a two-decade old regulation known as the Worker Protection Standard (WPS).

The finalized revision of the WPS includes increased mandatory training sessions to inform farmworkers on the protections their employers are required to offer them; expanded training to teach workers how to reduce "take-home exposure;" new anti-retaliatory provisions to protect whistleblowers who raise concerns; and "no-entry" application-exclusion zones up to 100 feet surrounding pesticide application equipment to protect workers from pesticide overspray. And, for the first time ever, the revision bars minors under 18 from handling pesticides.

The regulation, which will be phased in over the next two years, will affect agricultural workers and pesticide handlers who work on farms and in forests, nurseries, and greenhouses. Livestock workers are not covered. Once fully implemented, the revised regulation is expected to "avoid or mitigate approximately 44 to 73 percent of annual reported acute WPS-related pesticide incidents," according to the EPA.

Virginia Ruiz, Director of Occupational and Environmental Health at the advocacy group Farmworker Justice, told ThinkProgress that EPA could have "gone farther in certain areas," but welcomed the revised regulation as "a step in the right direction."

Setting a minimum age for pesticide applicators was especially well-received by farmworker organizations. "Children under 18 — their bodies are still developing and a lot of different systems are still maturing," Ruiz noted. "Exposing them to risk could have lifelong health effects."

"People are too immature at 16 years old to be able to handle pesticides, though many people thought that 18 was too young," Jeannie Economos, Pesticide Safety and Environmental Health Project Coordinator at the grassroots organization Farmworker Association of Florida, told ThinkProgress in a phone interview. "Younger people will think there's no problem, they may not wear personal protective equipment, they might not think it's necessary, or realize what they're doing to their health or other people."

Ruiz agreed that 16- and 17-year-olds don't have "the emotional maturity" to work with pesticides because they may not know how to interact with their bosses when they have health-related questions or concerns. "They might not feel comfortable challenging their supervisors or employers about the potential harm," she said.

World-first cold water pollution technology put to the test

A study of the world-first Cold Water Curtain at Burrendong Dam, near Wellington in the New South Wales central west, has shown some positive early results.

The $3.4 million curtain was installed last year to reduce the effect of cold water pollution on downstream eco-systems.

The infrastructure surrounds the dam outlet, diverting warmer water from the surface of the dam down to the outlet.

It means water from the bed of the dam, which can be up to 12 degrees cooler, is no longer being released into the Macquarie River downstream of the dam.

The curtain is now the subject of a study by the University of Technology Sydney (UTS) and the New South Wales Department of Primary Industries (DPI).

Simon Mitrovic is the principal research scientist for the DPI and a Senior Lecturer in the School of Life Sciences at UTS.

He says early testing, which involves taking temperatures at a number of sites both above and below the dam, indicates the curtain is working.

"It's looking like there's been about a 2 degree improvement since the curtain has been put in place," he said.

"We expect there to be greater improvement in the future as the dam gets deeper so as Burrendong Dam is quite low at the moment, the effectiveness of the curtain isn't as great as it will be when the dam is very full."

He says it is likely to have a positive impact on eco-systems below the dam.

"What I imagine is as temperatures increase we'll see organisms re-emerging and coming back upstream towards the dam."

The project is set to continue for another few years, giving a longer time frame to assess how the curtain is working.

Mr Mitrovic says it may lead to similar infrastructure being installed elsewhere.

Please read full and follow at: 

Oct 1, 2015

Google is going to start mapping air pollution all over California (FYI - @BernHyland)

In July, Google took a step towards mapping yet another piece of our lives: air pollution.

As part of a partnership with Aclima, a startup that designs environmental sensor networks, Google launched a pilot program to outfit its Street View vehicles in Denver, Colorado, with sensors that can detect a number of pollutants, including black carbon, nitric oxide, methane, and carbon dioxide.

Now the initiative is going big.

At this week's Clinton Global Initiative meeting, Aclima announced that Google Street View cars will map air pollution throughout California — in Los Angeles, San Francisco, and the Central Valley. The mapping starts immediately.

Aclima says it started with California because the state has some of the worst particulate pollution in the country.

If you live in these areas, brace yourself:
Once the data collection starts, you'll be able to see street-level air quality maps on both Google Maps and Google Earth. You will, in other words, be able to see exactly how polluted your city, neighborhood, and street are at various points in time. No word yet on exactly how often the maps will update.

The data will also be uploaded and shared on Google Earth Engine, where researchers, NGOs, and government officials can use it to study air pollution and its effects on cities.

The Environmental Protection Agency (EPA) has its own environmental sensors, but Google and Aclima can now provide more in-depth pollution monitoring on the hyperlocal level.

And since it will be accessible in Google Maps, non-scientists will actually see the data.

Read full and follow:

New Regulations on Smog Remain as Divisive as Ever

 (nytimes.com) In November, the Obama administration released a draft proposal of an updated ozone regulation, which would lower the current threshold for ozone pollution to 65 to 70 parts per billion. That range is less stringent than the standard of 60 parts per billion sought by environmental groups, but the environmental agency's proposal also sought public comment on a 60 parts-per-billion plan, keeping open the possibility that the final rule could be stricter.

Now, in the final days before the rule's release, industry groups are pushing for a new standard of 70 parts per billion or higher, while health and environmental groups want it as low as 60 parts per billion. Both sides say that every notch on that scale can make a big difference.

"There are significant health benefits as the standard is tightened," Mr. Billings said.

An analysis in the E.P.A.'s draft proposal found that an ozone standard of 70 parts per billion would prevent 325,000 cases of childhood asthma and 1,440 premature deaths. A standard of 65 parts per billion would prevent about a million cases of asthma and 4,300 deaths. And a standard of 60 parts per billion would prevent 1.8 million asthma attacks and 7,900 premature deaths.
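The marginal benefit of each tightening step follows directly from the EPA figures quoted above. A minimal Python sketch, using only the article's estimates (the subtraction itself is just arithmetic):

```python
# EPA draft-proposal estimates of annual cases prevented at each
# candidate ozone standard (parts per billion), as cited in the article.
estimates = {
    70: {"asthma": 325_000, "deaths": 1_440},
    65: {"asthma": 1_000_000, "deaths": 4_300},
    60: {"asthma": 1_800_000, "deaths": 7_900},
}

def marginal(stricter, looser, key):
    """Additional cases prevented by moving from the looser standard
    to the stricter one."""
    return estimates[stricter][key] - estimates[looser][key]

print(marginal(65, 70, "deaths"))  # 2860 additional premature deaths prevented
print(marginal(60, 65, "asthma"))  # 800000 additional asthma cases prevented
```

This makes concrete why "every notch on that scale can make a big difference": each 5 ppb step roughly doubles or triples the health benefit relative to the previous standard.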

But each notch ratchets up the cost to industry as well. A tighter smog standard would require the owners of factories and power plants to install chemical scrubbers and other technology on their smokestacks to remove the chemicals. Scrubbers can cost tens of millions of dollars apiece, and industry groups say that with each degree the standard is tightened, their costs will soar.

"Cutting 90 percent of pollution is one thing, but cutting 95 percent can be double the cost of getting to 90," said Howard J. Feldman, the director of regulatory affairs for the American Petroleum Institute, which lobbies for oil and gas companies.

Ross Eisenberg, a vice president at the manufacturers association, said that even a change of two parts per billion in the standard could make a difference. "At a standard of 68, there are 40 percent more counties in America that would be in noncompliance than there are with a standard of 70," he said. "A lot of counties would be dealing with this for the first time."

A standard of 65 parts per billion, Mr. Eisenberg said, could require the use of pollution control technology that does not exist yet. "That's when you have to start shutting things down," he said.

Read full and subscribe at:

Senate Floor Vote on TSCA Appears Imminent

BNA - By Anthony Adragna and Ari Natter
Sept. 29 — Legislation overhauling how the U.S. regulates its chemicals for the first time since 1976 could hit the Senate floor in the "next day or so," the office of Sen. Tom Udall (D-N.M.) and several other Senate aides said Sept. 29, after negotiators closed in on an agreement with Sen. Barbara Boxer (D-Calif.) that would allow consideration of the bill to proceed.

Sen. John Cornyn (R-Texas), the Senate's number-two Republican, told Bloomberg BNA Sept. 29 he expects the Senate to "take that up in October," referring to the Frank R. Lautenberg Chemical Safety for the 21st Century Act (S. 697). The bill currently has 56 co-sponsors from across 36 states.

"Senator Udall is very optimistic that we could see the Lautenberg chemical reform bill on the floor in the next day or so thanks to collaborative input from other senators in recent days," Jennifer Talhelm, Udall's spokeswoman, told Bloomberg BNA Sept. 29. "We hope we are very close to passing the bill out of the Senate."

The bill, sponsored by Udall and Sen. David Vitter (R-La.), would update the Toxic Substances Control Act, which governs industrial and other commercial uses of chemicals in the U.S. Supporters of the bill include Dupont, 3M, the Alliance of Automobile Manufacturers, American Chemistry Council, BASF Corp., Consumer Electronics Association, Dow Chemical Co. and the National Association of Manufacturers, among others.

No Official Agreement
Don Stewart, a spokesman for Senate Majority Leader Mitch McConnell (R-Ky.), would not confirm the bill's timing: "We have not announced any timing on the bill. Still." McConnell has previously named the TSCA reform legislation a good candidate for the chamber's consideration and multiple senators predicted it would garner 80 to 85 votes upon reaching the floor.

News of the bill's possible floor consideration comes as two state organizations urged Senate leadership to incorporate further changes to the measure in order to better protect the ability of states to manage chemical risks.

A key obstacle to the bill's advancing in the Senate has been Boxer, who views the bill as detrimental to chemical safety protections. But a Republican Senate aide with knowledge of the bill said the California Democrat was ready to drop her objections.

Boxer told Bloomberg BNA she and negotiators were "close" to an agreement that would allow consideration of the bill to proceed but declined to specify what it would look like.

"I'm not going to tell you what I'm negotiating, but it's going very well," Boxer said of the negotiations.

Please read the full article by Anthony Adragna and Ari Natter

A new study finds that people today who eat and exercise the same amount as people 20 years ago are still fatter.

A study published recently in the journal Obesity Research & Clinical Practice found that it's harder for adults today to maintain the same weight as those 20 to 30 years ago did, even at the same levels of food intake and exercise.
The authors examined the dietary data of 36,400 Americans between 1971 and 2008 and the physical activity data of 14,419 people between 1988 and 2006. They grouped the data sets together by the amount of food and activity, age, and BMI.

They found a very surprising correlation: A given person, in 2006, eating the same amount of calories, taking in the same quantities of macronutrients like protein and fat, and exercising the same amount as a person of the same age did in 1988 would have a BMI that was about 2.3 points higher. In other words, people today are about 10 percent heavier than people were in the 1980s, even if they follow the exact same diet and exercise plans.
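Because BMI is weight in kilograms divided by height in meters squared, the reported 2.3-point gap can be translated into kilograms for an illustrative person. The 1.70 m height and BMI-23 baseline below are assumptions chosen for illustration, not figures from the study:

```python
# BMI = weight (kg) / height (m)^2, so a fixed BMI gap translates into
# a weight gap that scales with height squared.
height_m = 1.70       # assumed height, not from the study
baseline_bmi = 23.0   # assumed 1988 baseline, not from the study
bmi_gap = 2.3         # reported BMI difference between the 1988 and 2006 cohorts

extra_kg = bmi_gap * height_m ** 2          # extra weight implied by the gap
pct_heavier = bmi_gap / baseline_bmi * 100  # gap as a share of baseline BMI

print(round(extra_kg, 1))     # about 6.6 kg for a 1.70 m person
print(round(pct_heavier, 1))  # about 10.0 percent
```

Note that 2.3 is 10 percent of a baseline BMI of 23, which is where the article's "about 10 percent heavier" figure comes from.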

"Our study results suggest that if you are 25, you'd have to eat even less and exercise more than those older, to prevent gaining weight," Jennifer Kuk, a professor of kinesiology and health science at Toronto's York University, said in a statement. "However, it also indicates there may be other specific changes contributing to the rise in obesity beyond just diet and exercise."

.... Kuk and the other study authors think that the microbiomes of Americans might have somehow changed between the 1980s and now. It's well known that some types of gut bacteria make a person more prone to weight gain and obesity. Americans are eating more meat than they were a few decades ago, and many animal products are treated with hormones and antibiotics in order to promote growth. All that meat might be changing gut bacteria in ways that are subtle, at first, but add up over time. Kuk believes the proliferation of artificial sweeteners could also be playing a role.

The fact that the body weights of Americans today are influenced by factors beyond their control is a sign, Kuk says, that society should be kinder to people of all body types.

"There's a huge weight bias against people with obesity," she said. "They're judged as lazy and self-indulgent. That's really not the case. If our research is correct, you need to eat even less and exercise even more" just to be the same weight as your parents were at your age.

Please read full and subscribe to:


Sep 30, 2015

PHMSA Publishes Final Rule on Fitness and SOPs for Special Permits and Approvals (HM-233E)

On Sept. 10, the U.S. Department of Transportation's (DOT) Pipeline and Hazardous Materials Safety Administration (PHMSA) published a final rule that incorporates standard operating procedures and criteria for "fitness to perform" into the evaluation of applications for special permits and approvals. The effective date of this final rule is Nov. 9, 2015. This rulemaking was required under the reauthorization bill adopted in 2012, Moving Ahead for Progress in the 21st Century (MAP-21).

ACA and a broader coalition of industries had petitioned PHMSA to address the standard operating procedures for determining fitness and the fitness criteria upon which such decisions are made. While PHMSA declined to initiate such a rulemaking, ACA and the coalition successfully advocated for language in the reauthorization bill that would force PHMSA to do so.

PHMSA's Special Permits program was created to allow companies to transport and package hazardous materials in domestic transport in a manner not specifically authorized by the Hazardous Materials Regulations (HMR). Special Permits authorize, for example, the movement of a new substance or allow the use of a new or unique packaging. Approximately 4,500 special permits are maintained in PHMSA's current database. Many paint companies require Special Permits in order to ship raw materials, finished products or waste materials. ACA and its Transportation and Distribution Committee have spent considerable energy and resources responding to the changing policies and procedures of the Special Permits program over the last several years.

One of the real impacts of this final rule is that the standard operating procedures for review of applications for Special Permits, Emergency Special Permits and Approvals have now been incorporated into the HMR and are easily accessible to applicants. Prior to this rulemaking, these standard operating procedures were difficult, if not impossible, to find on PHMSA's website; consequently, the internal process for considering applications and the fitness criteria were not well understood. PHMSA's internal process is complex and involves several layers of review. This rulemaking did not significantly change that process, but it is now more transparent and codified in regulatory language.

The final amendments in HM-233E add the following provisions to the HMR:

  • Section 105.5, revised definitions for "approval" and "special permit" and clarifies who may issue them;
  • Section 107.1, new definitions for "applicant fitness", "fit or fitness", "fitness coordinator," and "insufficient corrective action";
  • Section 107.113/117/709, requires that the Associate Administrator review all applications in conformance with newly adopted standard operating procedures in Appendix A to Part 107;
  • Part 107, Appendix A, incorporates Appendix A into the HMR. Appendix A contains the standard operating procedures that PHMSA has been using to process applications for special permits, emergency special permits and approvals and to make decisions about fitness; and
  • Section 171.8, revised definitions for approval and special permit.

While ACA did not file comments on this proposed rule, its concerns were expressed by the Dangerous Goods Advisory Council (DGAC). DGAC had requested that PHMSA develop an expedited procedure to accomplish minor changes to a Special Permit, such as company name changes or change of address. PHMSA declined to develop such an expedited process, indicating that making such modifications to a Special Permit is not a significant burden to the agency, particularly if the application to do so is complete. In addition, PHMSA maintains that there is an added safety benefit in that screening of an application will reveal any profile changes for the applicant.

Read on:


California’s DTSC Releases Draft Guidance on Safer Consumer Products Regulations’ Alternatives Analysis

On Sept. 24, California's Department of Toxic Substances Control (DTSC) released draft guidance on Stage 1 of its Safer Consumer Products Regulations' (SCP) Alternatives Analysis. The agency is accepting comments on the draft Stage 1 guidance until Oct. 23. 

DTSC will also conduct webinars on Oct. 7 and Oct. 21, 2015, to discuss the guidance. You may download and comment on the Draft Stage 1 Alternatives Analysis Guide through the Safer Consumer Products Information Management System (CalSAFER).

According to DTSC, the Draft Stage 1 Alternatives Analysis Guide provides useful approaches, methods, resources, tools and examples of how to fulfill SCP's regulatory requirements. The draft of the Alternatives Analysis Guide only covers the first stage Alternatives Analysis required by the SCP regulations. A draft including the second stage Alternatives Analysis is scheduled to be released in the first quarter of 2016.

The two stages of the Alternatives Analysis process are:

First Stage: During the first stage the responsible entity identifies the goal, scope, legal, functional, and performance requirements of the Priority Product and the Chemical of Concern, and uses this information to identify and screen an array of alternatives to consider. When the first stage is completed, the responsible entity documents the analysis findings in a Preliminary Alternatives Analysis Report, and submits that report along with a Work Plan for completing the Alternatives Analysis to DTSC (see table on page 16 of the Alternatives Analysis Guide for more details).

Second Stage: During the second stage Alternatives Analysis, the responsible entity follows the approved work plan from the first stage Alternatives Analysis to compare the Priority Product with the alternatives still under consideration using all available information for the relevant factors. The second Alternatives Analysis stage contains an in-depth analysis that refines the relevant factors and product function descriptions of the first stage and expands the analysis to consider additional impacts, including life cycle and economic impacts (see table on page 17 of the Alternatives Analysis Guide for more details).

The California Safer Consumer Products Regulations were finalized in October 2013, and DTSC has already taken steps to list and potentially regulate three Priority Products. On March 13, 2014, DTSC proposed the following three Priority Products: 1) spray foam systems containing unreacted diisocyanates; 2) paint and varnish strippers containing methylene chloride; and 3) children's sleeping pads containing chlorinated tris. After a Priority Product has been listed, responsible entities will have 60 days to provide notice to DTSC if they are manufacturing a Priority Product for sale in California, and the responsible entity may then be required to conduct an Alternatives Analysis. After reviewing an industry Alternatives Analysis, DTSC may initiate a regulatory response to restrict, limit, or prohibit the use of the Priority Product or chemical.

In April 2015, DTSC released its Priority Product Work Plan for 2015 through 2017. The work plan outlines the product categories and chemical classes that DTSC will review to develop Priority Products under the SCP Regulations over the next three years to provide a "level of predictability" to entities that may become subject to the regulations. 

The work plan describes seven broad product categories and a list of potential chemicals or chemical classes for consideration under each broad product category. The work plan explains the department's prioritization methodology and its decision to select these particular products and potential chemicals for evaluation. Given the number of products in the work plan, it is highly likely that DTSC will not be able to evaluate all of these products over the course of the next three years.

Under the Building Products category, the work plan designates paint, primers, roof coatings, stains, varnishes, adhesives, sealants, and caulking as potential Priority Products for evaluation. With regard to building materials, DTSC cited concerns about exposure to sensitive subpopulations in the built environment, including workers and children, with a focus on flame retardants and potential impacts on indoor air quality and human health.

The work plan also identifies the following candidate chemicals in building materials for potential regulatory action: brominated or chlorinated organic compounds, isocyanates, metals (e.g., chromium VI), perfluorinated compounds, phthalates, and volatile organic compounds (e.g., formaldehyde, toluene). While DTSC has cited these chemicals found in building products as examples, the department may identify chemicals outside of this list after conducting its evaluations.  

According to the work plan, DTSC plans to announce three new Priority Products in 2015. Then, DTSC plans to raise that goal for 2016 and 2017 to at least five products each year. Before proposing potential Priority Products, DTSC intends to gather additional information on the product categories through workshops, stakeholder outreach, and data call-ins. It is important to note that the work plan does not introduce any regulatory requirements on industry.  

ACA staff has been engaged with its membership and DTSC throughout the development of California's Safer Consumer Products regulatory process, regularly attending workshops and providing comments.

Contact ACA's Tim Serie or Stephen Wieroniey for more information.



The Massive Oil Plume Beneath Pearl Harbor Isn't New, But It Is Shocking

Pearl Harbor was once known as Oahu's "bread basket" because it was such an important fishing area, teeming with ocean life. But since the construction of the iconic U.S. military base, the pristine harbor has been marred by environmental disaster. 

The 12,600 acres of land and water that make up the Pearl Harbor Naval Complex were added to the Environmental Protection Agency's National Priority List of hazardous waste sites in 1992. This list identified the area as a Superfund site, one that could harm local people or ecosystems due to hazardous waste. In 1998, the health department issued an advisory warning people against eating shellfish and fish caught in Pearl Harbor.

One of the base's more than 700 documented areas of contamination sits beneath Joint Base Pearl Harbor-Hickam's Halawa-Main Gate. There, bunker fuel and other petroleum products -- some of which the Navy says date back to World War II -- have been leaking from a tank farm and collecting in a large underground plume for decades.

Current estimates put the amount of spilled fuel at around 5 million gallons, or nearly half the volume of the 1989 Exxon Valdez oil spill in Alaska, Hawaii News Now reported earlier this month. The plume is approximately 20 acres, or 15 football fields, in size, according to the Navy. 
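Both comparisons check out with a quick unit conversion. The 10.8-million-gallon Exxon Valdez volume and the roughly 1.32-acre football field (360 by 160 feet, including end zones) are commonly cited reference values, not figures from the article:

```python
# Sanity-checking the article's two size comparisons.
SQFT_PER_ACRE = 43_560
FIELD_SQFT = 360 * 160  # 57,600 sq ft: playing field plus end zones (assumed)

plume_gallons = 5_000_000     # Navy estimate cited in the article
valdez_gallons = 10_800_000   # commonly cited Exxon Valdez spill volume
plume_acres = 20              # Navy estimate cited in the article

fraction_of_valdez = plume_gallons / valdez_gallons
fields = plume_acres * SQFT_PER_ACRE / FIELD_SQFT

print(round(fraction_of_valdez, 2))  # roughly 0.46, i.e. "nearly half"
print(round(fields, 1))              # roughly 15.1 football fields
```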

The Oil-Sands Glut Is About to Get a Lot Bigger

The last place oil producers want to be when prices plummet to profit-demolishing lows is midstream on a billion-dollar project in one of the costliest parts of the planet to extract crude.

Yet that's exactly where half a dozen oil sands operators from Suncor Energy Inc. to Brion Energy Corp. find themselves with prices for Canadian oil now hovering around $30 a barrel. While all around them projects have been postponed or canceled, their investments were judged too far along when the oil game suddenly moved from offense to defense.

These projects will add at least another 500,000 barrels a day -- roughly a 25 percent increase from Alberta -- to an oversupplied North American market by 2017. For companies stuck spending billions in a downturn, the time required to earn back their investments will lengthen considerably, said Rafi Tahmazian, senior portfolio manager at Canoe Financial LP.

"But the implications of slowing down a project are worse," said Tahmazian, who helps oversee about C$1 billion ($758 million) in energy funds at the Calgary investment firm.

A general rule of thumb says new plants require a West Texas Intermediate price of $80 a barrel to break even. Western Canada Select, a blend of heavy Alberta crude, is currently selling at a discount of about $14 a barrel to the WTI benchmark, which fell 1.5 percent Friday to settle at $46.05 on the New York Mercantile Exchange.
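Putting the article's numbers together shows how far underwater a new plant would be. A quick sketch, using the article's rule of thumb and quoted prices (the arithmetic is illustrative, not a producer's actual economics):

```python
# Breakeven math from the figures in the article, in US dollars per barrel.
wti = 46.05          # WTI settlement on the New York Mercantile Exchange
wcs_discount = 14.0  # typical Western Canada Select discount to WTI
breakeven = 80.0     # rule-of-thumb WTI price for new plants to break even

wcs_price = wti - wcs_discount  # effective price for heavy Alberta crude
shortfall = breakeven - wti     # how far WTI sits below the breakeven

print(round(wcs_price, 2))  # about 32.05, consistent with "around $30 a barrel"
print(round(shortfall, 2))  # WTI is about 33.95 below the breakeven threshold
```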
Please read full and follow at - Bloomberg Business