Unconventional/complex reservoirs

Data Journey: Digitalization Projects Deliver Returns for Operators

Industry leaders are harnessing the power of data to improve efficiencies, eliminate nonproductive time, and reduce Capex.

Woman using a futuristic touchscreen computer on a colorful background. Source: Getty Images.

Data are the backbone of what Klaus Schwab—the founder and executive chairman of the World Economic Forum—proclaimed to be the Fourth Industrial Revolution. This 21st‑century revolution is one of connectivity, advanced analytics, automation, and manufacturing technology.

While the road to this high-tech revolution started long ago with the use of water and steam to power mechanized production, it was electricity’s ability to power mass production that delivered us from the first to the second industrial revolution.

Electronics and information systems necessary to automate production ushered in the third revolution. Data, and our ability to decipher and extract value from seemingly random and disparate patterns of numbers and letters to create actionable intelligence, are rapidly transforming the way business is conducted.

Data are everywhere and in everything. But on the grand scale of time, the collection, analysis, and application of digital data is relatively young.

Billy Beane and the Major League Baseball team Oakland Athletics, for example, demonstrated the importance of data-driven decision making in the early 2000s by leveraging the power of data analytics to go from last-place contender to chasing the championship in what has become known as Moneyball.

Some 20 years later, the power of data and digital continues to evolve, with the bright lights of Qatar’s Lusail Stadium, for example, shining down last month on the final football match of the FIFA World Cup; every movement made by Al Rihla was captured, stored, and analyzed.

But Al Rihla was not a player; it was the football created by Adidas and used in each of the more than 60 matches held throughout the tournament, fitted with an internal sensor for real-time ball tracking. High-speed cameras and antennas surrounding the football pitch provided additional support, feeding data into FIFA’s enhanced football intelligence service.

The data collected during the matches were analyzed on a variety of metrics like possession time, line breaks, pressure on the ball, and forced turnovers, and then shared with teams with the goal of improving the game and its players.

Winning the “beautiful game” is not as simple as it looks, as there is more to it than kicking the ball into the opposing team’s net as many times as possible in 90 minutes. These data offer an inside look into the processes of success or failure.

Simple, too, is the production of oil and gas when explained on paper: drill a hole into the ground, route what flows to the surface into a pipeline, and then get paid for the effort invested in delivering that resource to the market when the price is right.

It is more complicated than that, and data are playing an important role in delivering production increases and more.

Data powered the shale gale that swept across the US oil and gas industry. David Millwee, vice president of drilling performance for Patterson-UTI Drilling, credits data for helping to deliver the yearly increases in the lateral footage drilled.

“In the Delaware Basin in New Mexico, an average rig in 2014 would deliver about 43,000 to 45,000 lateral feet of production footage to an operator. In 2016, that same rig was doing 63,000 lateral feet per year. It’s basically a 50% increase in 2 years,” he said, adding that in 2019, right before the market crash, that rig was delivering 93,000 to 95,000 lateral feet.

“When we looked at the numbers at the end of 2021, that rig was delivering about 150,000 lateral feet in 2021. Looking at that year-over-year, back to 2014 when the US was running 1,800 rigs at 43,000 lateral feet, to today where we’re in the mid-700s rigs, delivering almost four times that rate. You could say we have an equivalent rig count today of around 2,800 as compared to 2014,” he said.
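A rough check of that framing: about 150,000 lateral feet per rig in 2021 against about 43,000 in 2014 is a roughly 3.5-fold gain per rig, so a fleet in the mid-700s working at today's rate covers about as much lateral footage as some 2,600 rigs would have at 2014 productivity.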

Having access to the data, Millwee said, allowed the company to identify areas in which limitations or inefficiencies were preventing optimal performance.

“We were able to then go back to our customers and work with them to continuously drive improvement,” he said. “Having access to the data, and making sure we’re looking at the right information, helped to identify the right targets to attack, then measure the results and repeat the process. It’s a continuous improvement loop that will never stop.”

Challenges persist, opponents and components tire out, and victory goes to the team that outlasts the other or gets the highest price at the market.

Each of the preceding industrial revolutions was made possible by oil and gas with its replacement of coal as a primary fuel source for power generation. A common thread connecting North America’s baseball to the World Cup’s football to the oil and gas industry is the massive quantities of data each captures and stores.

There have been significant steps taken by the oil and gas industry across all its sectors, and while some of those steps were made by the supermajors and are more widely known, the advancement of the industry on its data journey is one with room for all, regardless of company size.

Small Operators Simplifying the Complex

Think of an E&P company as a professional sports team whose players are the wells. From rookies ready for drilling and completion to active players on production, or seasoned veterans on their second or third round of artificial lift, each well comes with copious amounts of data baggage needing a home and a plan to deliver returns.

Putting the immense amounts of data coming out of the world’s oil and gas fields to work is a significant challenge that the industry is tackling.

It is not a simple process for even the largest of operators and it also represents a fundamental change in how companies do business, which Datagration Chairman and Chief Executive Officer Peter Bernard said is necessary if a company is to evolve.

PetroVisor, Datagration’s patented software, copies data from multiple sources and brings it together under one unified data model to help companies make economic and operational decisions. PetroVisor users have seen up to 70% reduction in time spent organizing data, the company said.
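As a loose illustration of what bringing disparate sources “under one unified data model” can mean in practice, the Python sketch below maps two differently shaped feeds into a single shared record type so they can be queried together. The record type, field names, and helper functions are hypothetical examples, not PetroVisor’s actual interfaces.

```python
from dataclasses import dataclass
from datetime import date
from typing import Iterable, List

# Hypothetical unified schema that downstream analysis can rely on.
@dataclass
class WellRecord:
    well_id: str
    record_date: date
    oil_bbl: float
    gas_mcf: float
    source: str

def from_scada(rows: Iterable[dict]) -> List[WellRecord]:
    # Map raw SCADA-style rows (invented field names) into the unified schema.
    return [
        WellRecord(
            well_id=row["tag"].split(".")[0],
            record_date=date.fromisoformat(row["ts"][:10]),
            oil_bbl=float(row["oil_rate"]),
            gas_mcf=float(row["gas_rate"]),
            source="scada",
        )
        for row in rows
    ]

def from_accounting(rows: Iterable[dict]) -> List[WellRecord]:
    # Map monthly accounting exports (invented field names) into the same schema.
    return [
        WellRecord(
            well_id=row["WELL_NAME"],
            record_date=date(int(row["YEAR"]), int(row["MONTH"]), 1),
            oil_bbl=float(row["OIL_VOL"]),
            gas_mcf=float(row["GAS_VOL"]),
            source="accounting",
        )
        for row in rows
    ]

# Once both feeds share one model, a single query replaces hopping between spreadsheets.
unified = from_scada(
    [{"tag": "A-1.meter", "ts": "2022-06-01T00:00", "oil_rate": 210, "gas_rate": 840}]
) + from_accounting(
    [{"WELL_NAME": "A-1", "YEAR": "2022", "MONTH": "6", "OIL_VOL": 6300, "GAS_VOL": 25200}]
)
print(sum(r.oil_bbl for r in unified if r.well_id == "A-1"), "bbl reported for well A-1")
```

The point is not the particulars of the mapping but that every downstream decision works off one consistent set of records rather than source-specific formats.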

“This led to significant decreases in overall project time, increased capital and operating efficiencies, and higher production per dollar spent,” Bernard said. “Companies must be agile and able to generate superior financial performance with minimal resources to survive. Digitalization is key.”

A pair of privately held Texas operators—Steward Energy and Texas American Resources Company—are currently implementing PetroVisor into their operations and have realized early successes throughout the process.

Frisco, Texas-based Steward Energy is an independent E&P company holding more than 70,000 acres in the Permian Basin of Texas and New Mexico, with a particular focus on conventional oil fields in the Northwest Shelf.

The company started working with Datagration on the implementation in January 2022. Steward Energy currently has 199 producing horizontal wells, 1 producing vertical well, and 20 saltwater disposal wells.

Scott Stedman, chief development officer for Steward Energy, said that the company’s leaders realized it was very good at collecting data, but there was room for improvement when it came to analyzing the data to the company’s benefit.

“We made several efforts at drawing conclusions from our completion data, but those efforts were basic and did not yield any results. Regarding artificial lift, the time to ESP conversion to rod pump is critical to us, as it is a large expense reduction and reduced cost of future pulls. To date, this is determined by engineering surveillance,” Stedman said, adding that they are working with Datagration to automate this process.

“We are also finalizing a company performance dashboard showing production expenses and more on a corporate level, all in one place. This work was not done with a different method before; we just had to go back and forth between data sets,” he said.

According to Stedman, the company is finalizing implementation of PetroVisor, but they have had early success with ad hoc projects.

“The platform has been very helpful using techniques that we would not have been able to duplicate on our own. Notably, we had to deliver gas forecast estimates to our third-party midstream partner to gain approval for a large gas plant expansion,” he explained.

“PetroVisor was able to pull in all our production data and create area-specific gas forecasts. This was much more precise than previous methods using estimated gas-to-oil ratio and oil type curve volumes to develop a forecast. As we are a solution gas drive reservoir, gas forecasts can be very difficult as different types of wells (offset or not) produce gas at vastly different rates.”

Austin, Texas-based independent E&P company Texas American Resources Company holds acreage across the Eagle Ford Shale and Austin Chalk producing regions, with 219 oil‑ and 1 gas‑producing wells in Atascosa, Frio, and La Salle counties.

David Honeycutt, founder and chief executive of the company, said that the implementation of PetroVisor into operations should be complete by the first quarter of 2023. The rollout comes at a time when Texas American also has been upgrading its production infrastructure and electrifying its operations.

“PetroVisor is going to be really helpful for us with ESG, to tangibly demonstrate the before and the after benefits of our massive electrification project,” Honeycutt said. “The environmental footprint will be better, and so too will the functionality.”

Honeycutt said that the company uses multiple software packages, with the team doing the work in the platform for their discipline. With PetroVisor, there will no longer be siloes.

“All of the data will be collected, and we’ll be able to draw from them all. Before, we were dependent on the people in that discipline to do their work before we could see the work product. Now we can pick an integrated workflow from the data off these other platforms. You can see the overall performance of your production and can run diagnostics in real time to reduce the percentage of downtime in your operations.”

Digital Twin Delivers an Inside View

Data analytics can help an E&P company better understand the reservoir formations it pursues while maximizing production. Data also are key to monitoring facility and equipment performance, flagging anomalous conditions that could result in system damage.

Hess Corp. is an early adopter of the use of data analytics to enhance its E&P efforts. From 2014 to 2016, the company, through its use of proprietary algorithms, predictive analytics, machine learning, and automation, significantly optimized its drilling programs in the Bakken Shale, increased its operational efficiencies, and delivered a reduction in cash operating costs by 32%, according to a 2017 presentation.

Dryonis Pertuso, senior advisor of applied data analytics for Hess, said the company is committed to using digital technologies to enhance and improve the way it conducts its business. He sees the use of digital twins—a virtual model designed to accurately reflect a physical object—as a key enabler for the future state the company is moving towards on its digital journey.

Working with Paul Clare, process engineering advisor for Hess, and a team from Kongsberg Digital, Pertuso helped develop and implement a dynamic digital twin of the company’s Stampede facility in the Gulf of Mexico as part of a pilot project to enable process equipment condition monitoring on key topsides equipment.

The Stampede facility is located 115 miles south of Fourchon, Louisiana, in the Gulf of Mexico, and has a gross topsides processing capacity of 80,000 BOPD and 100,000 bbl of water-injection capacity per day.

Stampede condition-monitoring architecture diagram.
Source: SPE 210106.

“We think, at a high level, that there is a lot of potential to improve the way we monitor, operate, and optimize our operations,” said Pertuso. “The way we select assets to conduct pilots on is a two-way street. We partner with teams that are open to working with us. Paul and the team were open to this project because they know what digital twins can do for them.”

Pertuso et al. noted in their presentation at the 2022 SPE Annual Technical Conference and Exhibition that the two existing Stampede multi-purpose dynamic simulation models were repurposed as a real-time performance monitoring model.

“The models are used for operator training, and are linked to the real control room system,” Clare explained. “We had a high confidence that these dynamic simulations were a pretty good representation of the actual plant that we’ve got now. The key thing on this digital twin is making this dynamic simulation run in real time. It is receiving data from the field and constantly updating the model to reflect real-time conditions.”

Clare added that the beauty of a digital twin is that it can be used as a benchmark.

“If you’re looking at the pressure and temperature rates at your facility, you may question if they’re good or not. The digital twin offers that benchmark to see what it looks like, and if in real life, you’re not seeing these things, then there might be a problem to check on,” he said.
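A minimal sketch of the benchmarking idea Clare describes, assuming nothing about Hess’s or Kongsberg Digital’s actual implementation: live readings are compared against the twin’s expected values, and anything outside a tolerance band is flagged for a closer look. The sensor names, values, and tolerance below are invented for illustration.

```python
def flag_deviations(live: dict, expected: dict, tolerance_pct: float = 5.0) -> list:
    # Compare each live reading to the twin's expectation; return anything out of band.
    alerts = []
    for sensor, target in expected.items():
        actual = live.get(sensor)
        if actual is None:
            continue
        deviation = abs(actual - target) / abs(target) * 100.0
        if deviation > tolerance_pct:
            alerts.append(f"{sensor}: live {actual} vs. twin {target} ({deviation:.1f}% off)")
    return alerts

# Invented example values for a single piece of equipment.
twin_expected = {"discharge_pressure_psi": 1250.0, "discharge_temp_degF": 185.0}
field_reading = {"discharge_pressure_psi": 1248.0, "discharge_temp_degF": 205.0}

for alert in flag_deviations(field_reading, twin_expected):
    print(alert)  # only the temperature reading exceeds the 5% band
```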

Condition monitoring is what Clare described as an ongoing process with both preventative and reactive maintenance needs.

“If something breaks, then we fix it. With preventative maintenance, if I change a filter every 12 months, for example, then statistically, I will have fewer breakdowns,” said Clare. “Where we want to get to is predictive maintenance, where all of the data from the field gives us a view of how healthy the equipment is. What we’re trying to do is predict a failure before it becomes too bad, so what we find are minor maintenance needs versus a complete changeout of a piece of equipment.”

Clare said that “everything done now is from the outside. We’re looking at the lube oil levels, checking temperatures, checking vibrations. The digital twin allows us to look inside the process and see how that pump or compressor is running in real time from a process basis.”

The pilot project implemented a data analytics methodology on a booster gas compressor (BGC) installed on the Stampede facility. Two years of historical data from the facility were used to estimate remaining useful life and to detect anomalous operating conditions that result in damage to the system, eventually requiring a replacement or causing a shutdown, Pertuso wrote in SPE 210106.

The team tested the dynamic digital twin by going back in time and enabling the twin to consume process sensor data starting months before a known unplanned failure occurred. They then evaluated how early they could have picked up signs of malfunctions through the twin and assessed whether something could have been done to avoid those issues.

The team was able to detect the anomalous operating conditions for the BGC about 6 days prior to a defect being detected by the maintenance crew on the discharge cooler, Pertuso said, adding that it is “possible to avoid damage to the gas compression system if we can detect when the BGC is being operated in an anomalous operating condition and prevent early replacement of expensive equipment.”
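As a much-simplified sketch of that kind of early-warning check (not the methodology in SPE 210106), the snippet below flags the first reading that drifts well outside its recent historical band; the synthetic temperatures, window, and threshold are invented for illustration.

```python
import statistics
from typing import List, Optional

def first_anomaly(series: List[float], window: int = 30, z_limit: float = 3.0) -> Optional[int]:
    # Flag the first point that sits more than z_limit standard deviations
    # away from the mean of the trailing window of readings.
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mean = statistics.fmean(baseline)
        spread = statistics.pstdev(baseline)
        if spread > 0 and abs(series[i] - mean) / spread > z_limit:
            return i
    return None

# Synthetic daily cooler-outlet temperatures: steady for 60 days, then a slow upward drift.
temps = [95.0 + 0.1 * (day % 3) for day in range(60)] + [95.0 + 0.5 * d for d in range(1, 15)]
day = first_anomaly(temps)
print(f"First out-of-band reading on day {day}" if day is not None else "No anomaly detected")
```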

A goal of the project is to extend the real-time predictive maintenance and reliability effort across the entire company.

“Digital twins for us are a journey. We’re trying to create a system of engagement, to give our operation teams better tools, better situational awareness of how things are, and to make better decisions,” said Pertuso. “It will take time, with the other assets looking at this journey to learn and then start making their own mistakes, and not repeat the ones that we made.”

For Further Reading

SPE 112221 From Data Monitoring to Performance Monitoring by Michael Stundner, Schlumberger, et al.

SPE 181683 Deployment of a Generic Expert System to Rank Operations Business Opportunities Automatically Under Ever‑Changing Economic Conditions by Michael Stundner, myr:conn solutions GmbH, et al.

SPE 203022 Automated Well Portfolio Optimization for a Mature Hydrocarbon Field in the Middle East by Nagaraju Reddicharla, ADNOC Onshore, et al.

SPE 210106 Stampede Digital Twin: An Advanced Solution for Process Equipment Condition Monitoring by Dryonis Pertuso, Hess, et al.