Fracking the way to energy independence
With activists continually calling for more regulation, you would think that the oil and gas industry has been given a pass. In fact, oil and gas operators must meet dozens of state and local licensing and permit requirements that differ from state to state, amounting to a stack of paperwork that easily fills a four-drawer filing cabinet.
For a single operation, a company must meet a range of air, mineral, radiological, waste and water requirements specific to the particular geological, regulatory and business environment in each state. In addition, companies must comply with licensing, permitting, and other regulations at the county and city level. Regulations governing oil and gas operations are structured this way because a one-size-fits-all federal process doesn’t work in a large nation with many differing land and water structures.
Fracking was behind U.S. economic recovery
Despite the many regulations governing oil and gas drilling, there is still a lot of push-back against the industry. Most of this has centered on the practice of hydraulic fracturing (fracking), a process of injecting water, sand and a small amount of chemicals into shale rock formations deep below the surface to unlock oil and gas trapped in the rock. A combination of horizontal drilling and fracking has been responsible for taking the United States from an energy importer to an energy producer, and even exporter, in less than a decade.
According to the International Energy Agency (IEA), the U.S. is now on track to surpass Saudi Arabia's oil output by 2017. The U.S. oil and gas industry supports 9.6 million jobs and lowers unemployment in states that produce oil or gas. The 33 oil-producing states include some of the lowest unemployment rates in the nation; North Dakota's stood at 2 percent when the national rate was 8.3 percent. At North Dakota's current tax rate, revenues from the Bakken reserves could net the state $240 billion over the next 40 years. In Pennsylvania, Penn State has documented state and local tax revenues of $1 billion and expects that number to rise to $2 billion by 2020.
Penn State estimated that in Pennsylvania 23,730 jobs had been created in construction trade, 16,581 in retail trade, 14,886 in mining, 12,815 in health and social services, 11,042 in professional services, 9,974 in wholesale trade, and 7,767 in hotel and food services. Penn State’s study further estimated that the Marcellus development has created more than $500 million in added value in the retail trade, finance and insurance, health and social services industries, with another $200 million in transportation, information, administrative services, and hotel and food services industries.
In Pennsylvania, for example, the Chamber of Business and Industry estimated that the oil and gas industry had already invested more than $400 million in road improvements, and a new impact fee would raise an additional $200 million over several years.
The oil and gas boom has also helped U.S. manufacturers start bringing back work that had been sent overseas by providing cheaper energy and feedstock. The Reshoring Initiative, an organization that helps companies assess where to manufacture their products, estimates that 50,000 jobs have returned to the U.S. in the last three years, accounting for 12 percent of the total U.S. manufacturing jobs added since 2010 (per the Bureau of Labor Statistics). Among those companies bringing at least some of their production back to the U.S. are Whirlpool, Rolls Royce Aerospace, Siemens Gas Turbines, General Electric, Apple, Caterpillar, and Ford.
The facts on fracking
Some environmentalists would like to see fracking banned altogether. To better understand the issues, it helps to first understand the method:
•Fracking is performed after wells have been drilled to depths of 7,000 to 15,000 feet; drinking water aquifers typically lie only about 300 feet below the surface.
•Before any mixture of water, chemicals, and sand is used to frack the shale, the well is encased in multiple layers of steel and concrete to ensure that the fracking mixture is delivered directly to the shale layers targeted for fracturing, thousands of feet below the aquifer.
•The fracking mixture, along with the released oil or natural gas, is then drawn back up through the protected wellbore and filtered for reuse. Unusable byproducts are stored and/or disposed of by deep-well injection.
•The fracking mixture is 99.95 percent water and sand and 0.05 percent chemical. Most companies now post the chemical components used in fracking operations, and most have eliminated potentially harmful chemicals in favor of food-grade additives. See www.fracfocus.org for chemical disclosures. Some companies are also testing alternatives that use little to no water.
The controversy over fracking escalated in the last several years after the documentaries “Gasland” and “Gasland II” leveled serious claims of groundwater pollution, complete with vivid scenes of homeowners lighting water on fire as it flowed from their kitchen taps. The documentaries attracted a lot of attention from celebrities and the media, despite the fact that many of the claims and scenes featured in the films have since been debunked.
The overwhelming majority of scientific research, drawn from some 150,000 shale wells in the U.S., has landed on industry’s side in the debate over the benefits of oil and gas production versus the environmental impact. See the Center for Rural Pennsylvania’s “Impact of Marcellus Gas Drilling on Rural Drinking Water Supplies,” the University of Texas’ “Fact-Based Regulation for Environmental Protection in Shale Gas Development,” the Texas Department of State Health Services’ “Final Report: Dish, Texas Exposure Investigation,” William L. Ellsworth’s “Injection-Induced Earthquakes,” and Duke University’s “Methane contamination of drinking water accompanying gas-well drilling and hydraulic fracturing,” to name just a few.
Industry adopting greener practices
Nonetheless, industry has responded to concerns with safer and more efficient methods:
•Centralized fracturing – consolidating dozens of well operations in a single site to reduce well pad footprints, improve containment of emissions and handling of flowback products, and increase production
•Recycling water – by recycling the majority of wastewater for reuse, operators use less of this precious resource while reducing the space and resources required for containment and disposal of contaminants
•Proper frack monitoring – employing the latest technologies and techniques, refined over more than 60 years of fracking
•Closed-loop drilling – re-usable tanks for storing fracking fluids rather than large pits reduce the risk of leakage
•Low or no-water alternatives – using CO2, cyclic steam injection, and other alternatives to reduce or eliminate water used in fracking
Just as the industry is continually looking for better ways to extract oil and gas, it is likewise continually working on safer and greener methods.
And it’s already paying off: The U.S. Environmental Protection Agency found that methane emissions from oil and gas industry operations between 1990 and 2010 were 20 percent lower than its previous estimates, even as production of natural gas rose by 40 percent, thanks to industry’s largely voluntary improvements in pollution controls on pipelines, wells, and other infrastructure.
In addition, CO2 emissions from coal were down 18 percent in early 2012. That was the lowest first-quarter CO2 emissions from coal since 1983 and the lowest for any quarter since April-June 1986.
The U.S. Department of Energy said: “The decline in coal-related emissions is due mainly to utilities using less coal for electricity generation as they burned more low-priced natural gas.”
Now, that’s something worth celebrating as we recognize Earth Day this month.
Why Transmission & Distribution Utilities Need Digital Twins
As with any new technology, digital twins can raise as many questions as they answer. There can be a natural resistance, especially among senior utility executives who are used to established ways of working and need a compelling case to invest in new ones.
So is a digital twin just a fancy name for modelling? And why do many senior leaders and engineers at power transmission & distribution (T&D) companies have a gnawing feeling they should have one? Ultimately it comes down to one key question: is this a trend worth our time and money?
The short answer is yes, if approached intelligently and accounting for utilities’ specific needs. This is no case of runaway hype or an overwrought name for an underwhelming development – digital twin technology can be genuinely transformational if done right. So here are six reasons why in five years no T&D utility will want to be without a digital twin.
1. Smarter Asset Planning
A digital twin is a real-time digital counterpart of a utility’s real-world grid. A proper digital twin – and not just a static 3D model of some adjacent assets – represents the grid in as much detail as possible, is updated in real time and can be used to model ‘what if’ scenarios to gauge the effects in real life. It is the repository in which to collect and index all network data, from images, to 3D point clouds, to past reports and analyses.
With that in mind, an obvious use-case for a digital twin is planning upgrades and expansions. For example, if a developer wants to connect a major solar generation asset, what effect might that have on the grid assets, and will they need upgrading or reinforcement? A seasoned engineer can offer an educated prediction if they are familiar with the local assets, their age and their condition – but with a digital twin they can simply model the scenario and find out.
The decision is more likely to be the right one, the utility is less likely to be blindsided by unforeseen complications, and less time and money need be spent visiting the site and validating information.
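To make the idea concrete, here is a deliberately minimal sketch of that kind of ‘what if’ connection check. Everything here is hypothetical – the `Asset` structure, the asset names and the megawatt figures are illustrative, not from any real utility model – and a production twin would run a proper load-flow study rather than a simple capacity comparison:

```python
from dataclasses import dataclass

# Hypothetical, simplified asset model: each asset has a rated capacity
# and a present loading taken from the twin's live data.
@dataclass
class Asset:
    name: str
    capacity_mw: float
    loading_mw: float

def flag_upgrades(path: list[Asset], new_injection_mw: float) -> list[str]:
    """Return names of assets on the connection path that would exceed
    their rated capacity if the proposed connection were energised."""
    return [a.name for a in path
            if a.loading_mw + new_injection_mw > a.capacity_mw]

# A proposed 30 MW solar park connecting through two existing assets:
feeder = Asset("feeder-12", capacity_mw=50, loading_mw=35)
transformer = Asset("tx-04", capacity_mw=80, loading_mw=40)
print(flag_upgrades([feeder, transformer], new_injection_mw=30))
# feeder-12 is overloaded (65 > 50 MW); tx-04 is not (70 < 80 MW)
```

Even a toy check like this shows the shape of the benefit: the question "will this connection require reinforcement?" becomes a query against the twin rather than a site visit.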
As the energy transition accelerates, both transmission and distribution (T&D) utilities will receive more connection requests for anything from solar parks to electric vehicle charging infrastructure, to heat pumps and batteries – and all this on top of normal grid upgrade programs. A well-constructed digital twin may come to be an essential tool to keep up with the pace of change.
2. Improved Inspection and Maintenance
Utilities spend enormous amounts of time and money on asset inspection and maintenance – they have to in order to meet their operational and safety responsibilities. In order to make the task more manageable, most utilities try to prioritise the most critical or fragile parts of the network for inspection, based on past inspection data and engineers’ experience. Many are investigating how to better collect, store and analyze data in order to hone this process, with the ultimate goal of predicting where inspections and maintenance are going to be needed before problems arise.
The digital twin is the platform that contextualises this information. Data is tagged to assets in the model, analytics and AI algorithms are applied and suggested interventions are automatically flagged to the human user, who can understand what and where the problem is thanks to the twin. As new data is collected over time, the process only becomes more effective.
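The flagging step described above can be sketched in a few lines. This is an illustrative toy, not any vendor’s algorithm: the asset IDs, the idea of a scalar “defect score” per inspection, and the thresholds are all assumptions standing in for whatever analytics a real twin platform applies:

```python
# Hypothetical sketch: inspection findings are tagged to twin assets as a
# time series of defect scores (0 = pristine, 1 = critical). A simple rule
# flags assets whose latest score is high, or worsening and near-high.
def flag_interventions(history: dict[str, list[float]],
                       threshold: float = 0.7) -> list[str]:
    flagged = []
    for asset_id, scores in history.items():
        latest = scores[-1]
        worsening = len(scores) >= 2 and scores[-1] > scores[-2]
        if latest >= threshold or (worsening and latest >= threshold * 0.8):
            flagged.append(asset_id)
    return flagged

history = {
    "pylon-117": [0.20, 0.30, 0.35],     # stable and low risk: ignored
    "insulator-42": [0.50, 0.65, 0.75],  # above threshold: flagged
    "conductor-9": [0.40, 0.55, 0.60],   # worsening, near threshold: flagged
}
print(flag_interventions(history))  # ['insulator-42', 'conductor-9']
```

The point of the twin is the `history` side of this: because every finding is tagged to an asset in the model, each new inspection cycle enriches the data the rule (or a trained model in its place) works from.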
3. More Efficient Vegetation Management
Utilities – especially transmission utilities in areas of high wildfire risk – are in a constant struggle with nature to keep in check the vegetation that surrounds power lines and other assets. Failure risks outages, damage to assets and even fire. A comprehensive digital twin won’t just incorporate the grid assets – a network of powerlines and pylons isolated on an otherwise blank screen – but the immediate surroundings too. This means local houses, roads, waterways and trees.
If the twin is enriched with vegetation data on factors such as the species, growth rate and health of a tree, then the utility can use it to assess the risk from any given twig or branch neighbouring one of its assets, and prioritise and dispatch vegetation management crews accordingly.
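A minimal sketch of that prioritisation might look like the following. The scoring formula, tree names and numbers are invented for illustration – a real vegetation program would use species-specific growth models and regulatory clearance standards:

```python
# Hypothetical sketch: rank trees near powerline spans by a simple
# encroachment risk score built from twin data on clearance, growth and health.
def risk_score(clearance_m: float, growth_m_per_yr: float, health: float) -> float:
    # Fewer years until the canopy reaches the line means higher risk;
    # unhealthy trees (health near 0) are upweighted as fall hazards.
    years_to_contact = clearance_m / max(growth_m_per_yr, 0.01)
    return (1.0 / years_to_contact) * (2.0 - health)

trees = [
    ("oak-03", 4.0, 0.3, 0.9),   # healthy, slow-growing, good clearance
    ("pine-17", 1.5, 0.6, 0.8),  # fast grower close to the line
    ("elm-22", 3.0, 0.2, 0.2),   # distant but in poor health
]
ranked = sorted(trees, key=lambda t: risk_score(*t[1:]), reverse=True)
print([t[0] for t in ranked])  # ['pine-17', 'elm-22', 'oak-03']
```

Crews can then be dispatched down the ranked list, rather than trimming entire corridors on a fixed rotation.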
As with expansion planning, inspection and maintenance, the value here lies in less labor-intensive and more cost-effective decision-making and planning – essential in an industry of tight margins and constrained resources. What’s more, the value only rises over time as feedback allows the utility to finesse the program.
4. Automated powerline inspection
Remember, though, that to be maximally useful a digital twin must be kept up to date. A larger utility might blanch at the resources required not just to map and inspect the network once in order to build the twin, but to update that twin at regular intervals.
However, digital twins are also an enabling technology for another technological step-change – automated powerline inspection.
Imagine a fleet of sensor-equipped drones empowered to fly the lines almost constantly, returning (automatically) only to recharge their batteries. Not only would such a set-up be far cheaper to operate than a comparable fleet of human inspectors, it could provide far more detail at far more regular intervals, facilitating all the above benefits of better planning, inspection, maintenance and vegetation management. Human inspectors could be reserved for non-routine interventions that really require their hard-earned expertise.
In this scenario, the digital twin provides the ‘map’ with which the drone, in conjunction with its sensors, can plan a route and navigate itself.
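As a toy illustration of that ‘map’ role, here is a sketch of a drone planning a tour over pylon coordinates pulled from the twin. The nearest-neighbour heuristic and the coordinates are illustrative assumptions; real mission planning would account for no-fly zones, battery range and wind:

```python
import math

# Hypothetical sketch: plan a simple inspection tour over pylon positions
# (x, y coordinates taken from the twin) using a nearest-neighbour heuristic.
def plan_route(start: tuple[float, float],
               pylons: list[tuple[float, float]]) -> list[tuple[float, float]]:
    route, pos, remaining = [], start, list(pylons)
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(pos, p))  # closest pylon
        route.append(nxt)
        remaining.remove(nxt)
        pos = nxt
    return route

pylons = [(2.0, 0.0), (1.0, 0.0), (3.0, 1.0)]
print(plan_route((0.0, 0.0), pylons))
# visits the nearest pylon first: [(1.0, 0.0), (2.0, 0.0), (3.0, 1.0)]
```

The twin supplies the asset positions; the drone’s onboard sensors then handle the fine-grained obstacle avoidance along the planned legs.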
5. Improved Emergency Modelling and Faster Response
If the worst happens and emergency strikes, such as a wildfire or natural disaster, digital twins can again prove invaluable. The intricate, detailed understanding of the grid, its assets and their surroundings that a digital twin gives is an element of order in a chaotic situation, and can guide the utility and emergency services alike in mounting an informed response.
And once again, the digital twin’s facility for ‘what-if’ scenario testing is especially useful for emergency preparedness. If a hurricane strikes at point X, what will be the effect on assets at point Y? If a downed pylon sparks a fire at point A, what residences are nearby and what does an evacuation plan look like?
6. Easier accommodation of external stakeholders
Finally, a digital twin can make lighter work of engaging with external stakeholders. The world doesn’t stand still, and a once blissfully isolated powerline may suddenly find itself adjacent to the construction site for a new building or road.
As well as planning for connection (see point 1), a digital twin takes the pain out of processes that require interfacing with external stakeholders, such as maintenance contractors, arborists, trimming crews or local government agencies. The digital twin breaks down the silos between these groups and allows them to work from a single version of the truth; in future it could even be used as part of the bid process for contractors.
These six reasons why digital twins will be indispensable to power T&D utilities are only the tip of the iceberg; the possibilities are endless given the constant advancement of data collection and analysis technology. No doubt these will invite even more questions – and we relish the challenge of answering them.