The Public Lab Blog


stories from the Public Lab community




A message from the Public Lab staff

by warren | over 4 years ago | 0 | 4

Public Lab resists and rejects: racism, sexism, ableism, ageism, homophobia, transphobia, xenophobia, body shaming, religion shaming, education bias, and bullying.

We do not tolerate hatred towards women, people of color, or LGBTQ people, or hatred based on religious belief. We stand by our community and partners and are committed to continuing our work on environmental and health issues affecting people.


blog


Thoughts on Method 9 and its utility

by gretchengehrke | over 4 years ago | 3 | 4

Several people in our Public Lab community are concerned about various kinds of airborne emissions, and what we as the public can do about them. One of the most accessible methods for assessing emissions is to estimate the opacity of emissions (I’ll explain a bit about this below) using EPA’s Method 9. Some community members have gone through Method 9 training and have found it very useful; others have found that it hasn’t been useful for their situations. I recently went through the training and became certified for Method 9, and I want to share some of the things I learned in that process, and my thoughts about the potential utility of Method 9 for various situations and concerns. Please comment and share your thoughts too!

What is opacity and what is Method 9?

Opacity is the extent to which light is blocked, or the extent to which you can’t see through an emissions plume. Opacity is caused by small particles and gases that absorb, reflect, or refract light. Particles that are similar in size to visible light wavelengths (390-700 nm) scatter visible wavelengths effectively, muting light rather than preferentially reflecting a given color; carbon particles (like soot) effectively absorb light, contributing heavily to plume opacity.

Method 9 is a method for standardizing the direction and distance from which you observe an emissions plume, with regard to the plume direction (related to wind direction), the sun’s position, and the stack (or pile) height. The basics are that the sun needs to be at your back, you need to be looking perpendicular to the plume, and you should be between “3 stack heights” (three times the height of wherever the emission is coming out or off of the source) and a quarter mile from the source. These guidelines can be very tough to follow, however, given potentially limited access to unobstructed views of the emissions source. Method 9 recommends that you monitor emissions in the morning or afternoon -- not midday -- and move if the wind suddenly changes direction. Again, this is easier said than done, especially if you are not directly on the property where the emissions are occurring. The training for Method 9 includes a lecture component and field training where you practice estimating the opacity of plumes, training your eyes to discern smoke opacities to roughly 5% resolution.

Starting as early as 1859 (in the City of New Orleans vs. Lambert case), smoke opacity has been used to regulate air pollution. Today, states regulate plume opacity for point-source emissions (like those from smokestacks), and most regulate the opacity of fugitive emissions. A common opacity limit is 20%, which means that 20% of light is blocked: looking through the plume, you can see only 80% of the background behind it. In practice, a plume at 20% opacity is visible, but its boundaries can be hard to make out -- it’s really not thick smoke. Method 9 is used to measure opacity and enforce state emissions opacity regulations; if you are certified in Method 9 (which anyone can do), you can report violations and prompt an enforcement response.
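
To make the arithmetic concrete, here is a minimal sketch in Python (not part of Method 9 itself, and with invented reading values) of how a log of opacity readings taken every 15 seconds over 6 minutes -- the observation schedule described later in this post -- would be averaged and compared against a 20% limit:

    # Hypothetical Method 9-style observation log: one opacity reading (in percent)
    # every 15 seconds for 6 minutes, i.e. 24 readings. Values are invented for illustration.
    readings = [10, 15, 15, 20, 25, 25, 30, 25, 20, 20, 15, 15,
                20, 25, 30, 30, 25, 20, 15, 15, 10, 10, 15, 20]

    OPACITY_LIMIT = 20  # a common state opacity limit, in percent

    average_opacity = sum(readings) / len(readings)  # simple mean over the 6-minute period
    print(f"Average opacity over 6 minutes: {average_opacity:.1f}%")
    print("Exceeds the limit" if average_opacity > OPACITY_LIMIT else "Within the limit")

This is only a rough illustration of the averaging step, not the full regulatory procedure, which also specifies observer position, documentation, and certification requirements.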

What are some of the limitations of the method?

Method 9 can be very useful, but also has many limitations.

  1. First and foremost, Method 9 only allows you to assess visible emissions -- it provides no ability to ascertain the chemical composition of what is being emitted, and is not useful for most vapor emissions.

  2. Steam plumes are a significant complicating factor too, as steam is not subject to opacity rules, and it is often difficult to tell whether a plume contains steam.

  3. The physical restrictions of conducting Method 9 also limit its utility, since it is often not possible to view plumes while meeting the specific siting requirements mentioned above.

  4. A significant limitation of Method 9 is that there is no residual evidence of the visible emissions observed, which can limit an agency’s ability to enforce against violations. It is recommended that people conducting Method 9 also take photographs of the site and the emissions to document what was observed. There is also a digital camera alternative to Method 9, which has its own limitations and is discussed below.

  5. Another limitation is that observers need to be re-certified every 6 months, with each certification training/test costing about $200. In some places this fee is waived for citizens and covered by permit fees paid by industry, but in most places each person is responsible for paying their own certification fee, which can be exclusionary for people who can’t afford it.

What about fugitive emissions?

In some states, like Wisconsin, opacity limits apply to fugitive emissions too. Fugitive emissions are any emissions from a process that do not pass through a designated emissions point -- construction dust plumes, dust kicked up from unpaved roads, wind-blown dust coming off of sand piles, plumes emanating from blasting, etc. Assessing the opacity of fugitive emissions can be complicated since there often isn’t a distinct plume with a distinct direction, but as long as you are looking through the narrowest/shortest dimension of the emissions and are following the proper siting requirements (i.e. sun at your back, appropriate distance from the emissions point), Method 9 assessments are valid for fugitive emissions. Since fugitive emissions are more sporadic and variable than smokestack emissions, it is recommended that you become familiar with the characteristics of those emissions before starting your monitoring. It is also useful to check your state’s regulations before you start, since some states, like Colorado, unfortunately exempt fugitive emissions from opacity rules.

Where can I learn about emissions opacity regulations in my state?

Visible emissions opacity limits are included in each state’s air pollution regulations. Usually these regulations are searchable online in each state’s administrative code on the state legislature’s website. Opacity standards are also included in the “State Implementation Plan” (SIP), which each state develops to detail how it will achieve the National Ambient Air Quality Standards (NAAQS). Legislative websites and SIPs can both be somewhat onerous to navigate, so it may be most efficient to search for “opacity” on your state’s environmental agency (usually a DEP, DEQ, or DNR) website.

What are some similar methods?

There are several other EPA-recognized methods that are similar to Method 9.

  1. Alternative Method 82, also known as ASTM D7520, is the “Digital Camera Opacity Technique” (DCOT) that can be used in place of Method 9 when approved. Note that Alternative Method 82 is only approved to demonstrate compliance (or lack thereof) with federal opacity limits, not with opacity limits set by a state or municipality. Alternative Method 82 does have the advantage of producing a data record of visible emissions; however, it can be very difficult to conduct in practice. First, the DCOT system, which includes a digital camera, a photo analysis software platform, and a results assessment and reporting component, needs to be certified, and this certification process is more arduous than that of Method 9. The DCOT operator has to complete a manufacturer-specified training course and follow all of the Method 9 siting requirements (plus a couple of additional restrictions), and the images generally have to be sent to a third party for analysis. Also, as of today, there is still only one DCOT system that is commercially available and certified to conduct Alternative Method 82, and the software licenses can cost thousands of dollars per year. In addition to the cost of the DCOT system and analysis, the method also takes longer and cannot identify opacity violations immediately (processing takes time). Therefore, while Alternative Method 82 has some definite advantages (notably the data record with photographs), it currently has considerable drawbacks. The company whose training I took, AeroMet, recommends that the EPA address these drawbacks to make the method more accessible and feasible for people to actually use.

  2. Methods 203a, 203b, and 203c are alternatives to Method 9 for slightly different types of opacity limits: time-averaged, time-exception, and instantaneous limits, respectively. Each follows the same general procedure, but in 203a the total time assessed can be 2-6 minutes (whereas Method 9 requires 6 minutes), 203b averages the amount of time that emissions are above the opacity limit, and 203c takes observations every 5 seconds for 1 minute (whereas Method 9 takes observations every 15 seconds for 6 minutes). For Methods 203a-c to be acceptable for assessing compliance with air pollution regulations, that must be specified in the state implementation plan.

  3. Method 22 is somewhat similar to Method 9, but is used to assess the frequency of visible emissions, not their opacity. Method 22 is mostly used for fugitive emissions and gas flares. In Method 22, the observer uses two stopwatches, one to measure the total time elapsed and one to measure the time when visible emissions are present, to determine the frequency and percentage of time that a source is visibly emitting (the sketch after this list illustrates the arithmetic). For Method 22, the observer can be indoors or outdoors, and there are fewer siting requirements overall. If industries are subject to compliance assessed by Method 22, it will be stated in the SIP.
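
As an illustration of the Method 22 bookkeeping mentioned in the last item above, here is a minimal sketch in Python (with invented stopwatch values -- the numbers are not from any real observation) of how the two stopwatch totals translate into the percentage of time that emissions are visible:

    # Hypothetical Method 22-style tally: one stopwatch tracks the total observation
    # time, the other accumulates only the time when visible emissions are present.
    # Both values are invented for illustration.
    total_observation_seconds = 3600   # e.g. a one-hour observation period
    visible_emission_seconds = 540     # accumulated time with emissions visible

    percent_visible = 100 * visible_emission_seconds / total_observation_seconds
    print(f"Visible emissions were present {percent_visible:.1f}% of the time")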

SmokeSchool.jpg


plume epa air-quality blog


Enforcing Stormwater Permits with Google Street View along the Mystic River

by mathew | almost 5 years ago | 1 | 4

Compressed autos at Mystic River scrap yard, Everett, Massachusetts, 1974. Spencer Grant/Getty Images. CC-NC-SA

In 2015 the New America Foundation asked @Shannon and me to write a chapter for their Drone Primer on the politics of mapping and surveillance. I worked in an example of positive citizen surveillance by the Conservation Law Foundation (CLF) that I’d heard about in a session at the 2015 Public Interest Environmental Law Conference. I’ve excerpted and adapted my writeup of the CLF case as a part of our ongoing Evidence Project series. If you know of similar cases please get in touch!

Geo-tagged aerial and street-level imagery on the web can be a boon to both environmental lawyers and the small teams of regulators tasked by US states with enforcing the Clean Water Act. Flyovers and street patrols through industrial and residential districts can be conducted rapidly and virtually, looking for clues to where the runoff in rivers is coming from. Combined with searchable public permitting data, aerial and street-level photographs make the 1972 Clean Water Act’s stormwater regulations more enforceable in practice than they have ever been (Alsentzer et al., 2015).

State and federal environmental agencies often do not have the time or resources to adequately enforce permits under the National Pollutant Discharge Elimination System (NPDES), which regulates construction and industrial stormwater runoff, and roughly half of facilities violate their stormwater permits every year (Russell and Duhigg, 2009). Enforcement can be picked up by third parties, however, because NPDES permits are public. Plaintiff groups and legal teams conduct third-party enforcement through warnings and lawsuit filings. Legal settlements recoup the plaintiffs’ legal costs, and can also include fines whose funds are directed toward community-controlled Supplemental Environmental Projects that help improve environmental conditions in the violator’s watershed. The Conservation Law Foundation (CLF), a Boston-based policy and legal non-profit, operates in precisely this manner, recouping its costs through lawsuits and directing funds to Supplemental Environmental Projects in the Mystic River Watershed.

In 2010 a neighborhood group approached the CLF about a scrap metal facility on the Mystic River. Observable runoff demonstrated the facility had never built a stormwater system, and a quick US Environmental Protection Agency (EPA) NPDES permit search revealed that they had never applied for or received a permit. The facility was flying under the EPA’s enforcement radar, and so were four of the facility’s neighbors.

Between 2010 and 2015, CLF’s environmental lawyers initiated 45 noncompliance cases by looking for industrial facilities along waterfronts in Google Street View and then searching the EPA’s stormwater permit database for each facility’s address. Most complaints are resolved through negotiated settlement agreements, where the facility owner or operator funds Supplemental Environmental Projects for river restoration, public education, and water quality monitoring that can catch other water quality violators. Together, CLF and a coalition of partners, such as the Mystic River Watershed Association, are creating a steady stream of revenue for restoration, education, and engagement in the environmental health of one of America’s earliest industrial waterways.

Regardless of their effect, legal threats are stressful, often expensive, and can take years to resolve. Even when threatened polluters are acting in good faith to clean up their systems, the process of identifying and persuading companies to comply with environmental regulations can strain relationships in communities. Non-compliant small businesses on the Mystic River that have been in operation since before the Clean Water Act was passed in 1972 may never have been alerted to their obligations under the law. Their absence from the EPA database reflects mutual ignorance: bureaucrats unaware of the businesses, and businesses unaware of the bureaucracy. However, businesses bear the direct costs of installed equipment, staff time, and facility downtime; the indirect costs to professional reputation from delayed operations or identification as a polluter; and the transactional costs of paying for legal assistance or court fees. Indirect and transactional costs are hidden punishments that can accrue regardless of guilt or readiness to comply.

To combat the negative perceptions that can accrue from the use of legal threats, CLF proactively works to fit itself into a community-centered watershed management strategy. CLF and their partners run public education and outreach campaigns, and start by issuing warnings that aren’t court-filed (Alsentzer et al., 2015). Identifying and working with businesses operating in good faith is a tenet of community-based restoration efforts. By using courts as a last resort and participating in public processes where citizens can express the complexity of their landscape relationships, CLF and their partners are increasing participation in environmental decision-making and establishing the legitimacy of restoration and enforcement decisions.

Regulations and permit databases can often be tough to put to work, but the CLF’s case was fairly straightforward: they simply searched for companies’ addresses in a publicly available database. We would love to hear of more groups using this approach or other simple modes of regulatory engagement.

Excerpted and adapted from Mathew Lippincott with Shannon Dosemagen, “The Political Geography of Aerial Imaging,” pp. 19-27 in Drones and Aerial Observation, New America Foundation, 2015.

CC-NC-SA

Sources and Further Reading:

Alsentzer, Guy, Zak Griefen, and Jack Tuholske. 2015. CWA Permitting & Impaired Waterways. Panel session at the Public Interest Environmental Law Conference, University of Oregon.

Conservation Law Foundation Newsletter “Coming Clean,” Winter 2014.

D.C. Denison, “Conservation Law Foundation suing alleged polluters”, Boston Globe, May 10, 2012.

Russell, Karl, and Charles Duhigg. “Clean Water Act Violations are Neglected at a Cost of Suffering.” The New York Times, Sept 12, 2009. Part of the Toxic Waters series.


evidence epa blog water


What goes into choosing a topic name?

by liz with abdul, Bronwen, cfastie, gretchengehrke, nshapiro, warren | almost 5 years ago | 2 | 3

above: a sketch of figuring out how to organize "air" into a research area, which methods are part of that research area, and which activities would go on which grid... Photo by @nshapiro

We've been having some fun discussions over the past couple months with people on each of the topical lists about what to name the new "top-level" pages where we're organizing. That means -- when posting activities, do they end up on /wiki/balloon-mapping or /wiki/aerial-photography? Do we use the older /wiki/spectrometer page, or the new one at /wiki/spectrometry? But we're hoping for even MOAR discussion!

Let's think about:

  1. where and how these new pages will show up -- most likely on a dropdown menu and maybe eventually on the front page of publiclab.org,
  2. and the timing -- we're prioritizing the creation of these "origin" pages amidst all the activities and activity grids we've been working on and will continue to work on through Barnraising.

So far we've created drafts of:

Up next:

When naming new pages, some things to consider are that names should be:

Looking ahead, we have more naming to do! There are some mismatched names:

  • "dssk" vs. "desktop-spectrometry-kit-3-0"
  • "infragram" vs. "infrared" vs "multispectral-imaging"
  • "timelapse" vs. the broader" photo-monitoring"

We'd really like to hear from a wide selection of voices about naming! Please pile on in the comments! Thank you!


blog with:warren with:cfastie with:nshapiro


What makes a good activity?

by warren with gretchengehrke | almost 5 years ago | 13 | 4

In our continuing shift towards using the new Q&A feature and the new Activity grids as a framework for collaboration on PublicLab.org, we're encouraging people to post their work more in the spirit of Instructables.com -- "showing each other how to do something" rather than just telling people about something you've done. This shifts the emphasis from solely documenting what you've done, to helping others do it too. (image above from a Lego Technics kit)

There are several reasons we like this. A how-to guide (what we're calling Activities) must have extremely thorough and easy-to-follow steps (and may need to be revised if people get stuck). Perhaps even more importantly, its success (we hope) can be measured by how many people are able to follow the steps successfully, which exercises and fuels the power of broad communities and open science.

What's needed?

While there are various types of activities for various purposes, all of them ought to set out some basic information to help people get started:

  • a description of the purpose of the activity
  • a list of materials needed
  • a clear description of your conditions (e.g. lighting, temperature, or other relevant factors)
  • a detailed sequence of steps to follow
  • a description of how to confirm you've followed the steps correctly
  • a hypothesis or expected outcome
  • a discussion of your results
  • a list of questions to explore next (unknowns, or followup activities)
  • a request for input (there's always room for improvement!)

Speaking of room for improvement, can folks suggest other important parts of an activity? With an eye toward making it easy for anyone to write and post activities, and for others to replicate them, what's the minimum necessary?


(IKEA Stonehenge. Justin Pollard, John Lloyd, and Stevyn Colgan designed an IKEA manual for Stonehenge, publishing it under the title HËNJ in the QI 'H' Annual)

Drafts welcome

We'd also like to suggest that people post things early -- to share ideas, solicit input, and acknowledge that most posted activities will go through some (if not many) revisions as people try them out and offer feedback. Could we even have a separate "Publish Draft" button so they're clearly marked as such, and people know they're encouraged to share early and often?

Break it up!

One important way to increase the chances that people will complete a replication of your activity, we think, is simply to write shorter activities -- perhaps breaking up a longer set of steps into several related modules. Instead of posting a long and complex activity, a few shorter ones -- each with a simple way to verify that the steps so far were correctly completed -- are much more accessible, and will tend to separate distinct possible causes of failure for easier troubleshooting.

Distinct modular activities can be linked and referenced to create a larger activity that might span, for example, building a tool and verifying that it functions properly, calibrating it, and running lab or field tests of various materials with it. Even if the final activity cannot be completed without the previous activities, breaking them out into distinct activities that build on each other will help the onboarding process.

(above, @cfastie shows how to swap a lens in a #mobius camera)

Supporting activity authors

Finally, beyond this overview, what more can we do to make it easy to write good activities? Some have suggested a kind of "assistance group" who could provide helpful tips and constructive critique to people posting on Public Lab. This sounds like a great idea, and potentially extra helpful to folks who are hesitant or unsure of what makes a good and thorough post.

Would "activity templates" be useful, to the extent that they can be generalized?

We're also, of course, posting some example Activities, such as this spectrometer calibration activity, which we hope will help set some conventions.

Next steps

We're also interested in how people could be introduced to other activities on a topic once they complete the current one -- maybe there's a "sequence" of activities that grow in complexity? Or we could display a mini activity grid of "related activities" at the bottom of each one?

Finally, we're trying to figure out how people can request an activity for something they want to learn to do, but for which there is not yet an activity posted. This'll be especially important as we're starting out, since we have very few complete activities posted -- but it'll also be a great starting place for people hoping to share their knowledge and expertise. Our initial stab at this is to list "limitations and goals" for a given kit, clearly explaining the problem we'd like to solve. This is actually a list of questions using our new questions system -- and we imagine people might post an activity, then link to it as a proposed answer.

We need your input!

This is all quite new, and we'd love to hear other ideas for how this could work. And of course, if you're interested in giving it a try and writing an Activity, please do! Activity grids are going up on many wiki pages across the site, so if you have questions about where and how to post, please leave them in the comments below. Thanks!


collaboration community leaffest blog


Q&A enables automated FAQ

by liz with abdul, gretchengehrke, warren | almost 5 years ago | 0 | 1

Asking and answering questions is at the very heart of Public Lab. It's how we get started, it's how we make progress, it's how we get to know each other and our environmental concerns. Dedicated readers will recognize that some "getting started" exchanges have been repeated countless times on the mailing lists. (PS To those of you who are high-volume question answerers -- everyone is endlessly grateful for your responses!) While questions from newcomers, however repetitive, will always be welcome, generating a Frequently Asked Questions (FAQ) grid will lower the barrier to exchanging information.


There are two parts to the new automated FAQ system:

1) The new Question and Answer system that @Ananyo2012 built into the plots2 codebase this summer is up and running.
See it here: https://publiclab.org/questions and read more about it here: https://publiclab.org/wiki/public-lab-q-and-a

Screen_Shot_2016-09-07_at_12.18.18_PM.png

2) The FAQ Grid is a variation of the Activity Grid insofar as it's also generated by a powertag, and sorted by Likes.

FAQs will be on every "top-level" research page; see an example here: https://publiclab.org/wiki/spectrometry#Frequently+Asked+Questions

Screen_Shot_2016-09-07_at_12.14.07_PM.png

You can add an automated FAQ grid to any wiki page by using this code:

The title:

    ## Frequently Asked Questions

The button where people can ask a new question:

    <a class="btn btn-primary" href="/post?tags=question:spectrometry&template=question">Ask a question about spectrometry</a>

The grid itself:

    [notes:question:spectrometry]

Screen_Shot_2016-09-07_at_12.15.06_PM.png


As @mathew reported back from Write The Docs, pruning an automated FAQ system is superior to curating a manual one. Further, linking product support directly to documentation is so important that the Kits Initiative will move its knowledge base onto the Q&A system and will interact with customers through it.

Early adopters on method-specific mailing lists might consider subscribing to the relevant question:foo tag on the website (pssst, this is the start of a medium-term plan to move all mailing list interactions onto the website). For instance, spectrometry list members might want to subscribe here: https://publiclab.org/tag/question:spectrometry

Screen_Shot_2016-09-07_at_12.17.26_PM.png

Please write in with ideas and new suggestions! What do you think?


community blog with:warren with:gretchengehrke