It’s the start of a busy month for the HanAra team as members participate in conferences on both coasts and in the middle of the United States. This past week, we joined transmission and distribution experts in sunny San Diego at DISTRIBUTECH 2023. It was our first time attending the event, and we enjoyed the open conversations about the future of the transmission and distribution industry.

This year, the conference explored 13 tracks essential to the T&D industry, including Data Analytics. As a data solution provider, we believe in the power of data and look forward to helping the T&D industry broaden its use of data analytics to achieve operational excellence. When expanding the use of T&D data analytics, we always recommend considering location, amount, and resolution.

Towards the data-driven utility

The Data Analytics track is focused on data driven insights in grid operations and customer experience, advanced network model management leveraging artificial intelligence and machine learning and current business drivers for next generation data architecture. The Data Analytics track is designed for a broad audience of utility and industry attendees including IT executives, Operations executives, Data Scientists, Network Engineers and anyone who is looking for real-world examples of how best-in-class companies capture and analyze data for operational excellence.

-DISTRIBUTECH’S DATA ANALYTICS TRACK OVERVIEW

Location

The service area for the T&D industry is large and diverse. For example, in HanAra’s hometown of Austin, Texas, the local energy company has more than 50 substations, and counting, across the city. Unlike a manufacturing plant or a traditional power generation site, T&D assets form a decentralized network spread across the service area. This decentralization makes it operationally impossible for a person to monitor all the assets on-site. As a result, the T&D industry must rely on off-site monitoring to track the current operational status of its assets.

Understanding current operations requires that these disparate assets are not black boxes. From transformers to relays to switches, each piece of equipment must communicate properly with other devices and share its real-time data. This need for communication is where standardization and interoperability simplify implementation and maintenance. For example, by following IEC 61850 (the international standard for communication with intelligent electronic devices in electrical substations), an organization simplifies how it gets the necessary data off-site. This standardization also makes it easier to connect to third-party solutions such as a data historian or a predictive analytics solution.
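
To make the idea concrete, here is a minimal sketch (in Python) of how standardized naming pays off downstream. The substation name, object references, and historian tag format below are illustrative assumptions, not a specific vendor’s interface; the point is that one mapping rule can cover every site when the data points follow a common standard.

```python
# A minimal sketch (not vendor-specific) of how standardized point naming,
# as in IEC 61850 object references, simplifies forwarding substation data
# to a historian. The names and tag formats below are illustrative only.
from dataclasses import dataclass

@dataclass
class Measurement:
    substation: str      # e.g., "SUB_SOUTH_07" (hypothetical site name)
    object_ref: str      # IEC 61850-style reference, e.g., "XFMR1/MMXU1.TotW"
    value: float
    timestamp: str       # ISO 8601 time of the reading

def to_historian_tag(m: Measurement) -> str:
    """Build a historian tag from the standardized object reference.

    Because every substation exposes the same logical-node naming,
    one mapping rule covers the whole fleet instead of per-site code.
    """
    return f"{m.substation}.{m.object_ref.replace('/', '.')}"

if __name__ == "__main__":
    reading = Measurement("SUB_SOUTH_07", "XFMR1/MMXU1.TotW", 41.7,
                          "2023-02-10T14:05:00Z")
    print(to_historian_tag(reading), reading.value)
    # -> SUB_SOUTH_07.XFMR1.MMXU1.TotW 41.7
```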

Amount

With a large and diverse set of locations, the amount of data for the T&D industry is also vast. This data serves as one of the foundational pillars of the smart grid concept. A large amount of sensor data is already available in the industry, and the sector continues to study what additional sensing it will need. As the Department of Energy notes in its Sensor Technologies and Data Analytics report, the industry must optimize “the placement (type, number, and location) of sensors subjected to application specific objectives (for example, reliability improvement) and constraints (for example, physical placement and cost/budget limitations).”
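
As a rough illustration of that placement trade-off, the toy sketch below greedily selects hypothetical sensor sites by score-per-cost until a budget is exhausted. The sites, scores, and costs are invented for the example; a real study would use application-specific objectives and constraints as the report describes.

```python
# A toy sketch of the placement trade-off described above: pick sensor
# locations that maximize an application-specific objective (here, a made-up
# "reliability score") without exceeding a budget. The candidate sites,
# scores, and costs are illustrative, not real data.
def greedy_placement(candidates, budget):
    """candidates: dict of site -> (score, cost); returns (chosen sites, spend)."""
    chosen, spent = [], 0.0
    # Greedily take the best score-per-cost site that still fits the budget.
    for site, (score, cost) in sorted(
        candidates.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True
    ):
        if spent + cost <= budget:
            chosen.append(site)
            spent += cost
    return chosen, spent

if __name__ == "__main__":
    sites = {
        "feeder_12_mid": (8.0, 3.0),   # (reliability score, cost in $k)
        "sub_north_xfmr": (9.5, 5.0),
        "feeder_04_end": (4.0, 1.5),
        "sub_east_bus": (6.0, 4.0),
    }
    print(greedy_placement(sites, budget=8.0))
```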

As more sensor data comes online alongside the already sizeable existing data, data management and storage will be essential to future success. This is where edge computing can help reduce inefficiencies and improve performance. The amount of data available, combined with the dispersed locations, makes the T&D industry a great environment for edge computing. For example, at a substation, rather than sending all transformer data to a centralized data historian at headquarters, an edge solution sends only changes of value and critical information.

Edge computing allows devices in remote locations to process data at the “edge” of the network, either by the device or a local server. And when data needs to be processed in the central datacenter, only the most important data is transmitted, thereby minimizing latency.

-Microsoft
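
A simple way to picture the change-of-value approach mentioned above is a deadband filter running at the edge: forward a reading only when it moves meaningfully or a heartbeat interval has elapsed. The thresholds and sample data below are assumptions for illustration, not a recommendation for any particular asset.

```python
# A minimal sketch of the change-of-value idea: instead of streaming every
# raw transformer reading to the central historian, an edge node forwards a
# sample only when it moves beyond a deadband or a heartbeat interval has
# passed since the last forwarded sample.
def cov_filter(samples, deadband=0.5, heartbeat=60):
    """Yield only samples that changed by more than `deadband` or arrive
    more than `heartbeat` seconds after the last forwarded sample."""
    last_value = None
    last_sent_at = None
    for t, value in samples:          # samples: iterable of (seconds, reading)
        if (
            last_value is None
            or abs(value - last_value) > deadband
            or t - last_sent_at >= heartbeat
        ):
            last_value, last_sent_at = value, t
            yield t, value

if __name__ == "__main__":
    raw = [(0, 65.0), (10, 65.1), (20, 65.9), (30, 66.0), (90, 66.1)]
    # Only the first sample, the >0.5 jump at t=20, and the heartbeat at t=90 pass.
    print(list(cov_filter(raw)))
```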

Resolution

Beyond the amount of data available, there has also been an explosion in the resolution of data available in the T&D industry. Data analytics benefits from improved data resolution because it supports data quality, which includes correctness, completeness, consistency, and timeliness. But higher resolution does not always equate to more value. For example, when looking at consumer meter readings, a smart meter does not have to provide a reading every second to provide useful information on electricity usage. A lower data resolution is sufficient.
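
As a quick illustration, the snippet below downsamples synthetic 1-second meter readings to 15-minute averages, which is often all a load study needs. The data, intervals, and variable names are assumptions made for the example.

```python
# A small illustration of the resolution point above: 1-second meter readings
# can be downsampled to 15-minute averages when that is all the analysis
# needs. The readings are synthetic and for illustration only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
index = pd.date_range("2023-02-06 00:00", periods=3600, freq="s")  # one hour at 1 s
usage_kw = pd.Series(1.2 + 0.1 * rng.standard_normal(3600), index=index)

# A 15-minute average demand profile is often enough for load studies.
demand_15min = usage_kw.resample("15min").mean()
print(demand_15min)
```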

When considering data resolution, the industry must ask itself what is needed to answer the important questions related to reliability and sustainability. For metered data, for example, the question may be about future energy load requirements; for substation equipment data, it may be which potential failures exist. The answers give a clearer idea of the required data resolution. The good news is that, regardless of the resolution, technology has progressed to the point where software solutions like data historians can manage data at the nanosecond level, so you won’t run into a software obstacle.

Learn More

Missed us at DISTRIBUTECH 2023 and interested in learning how we help with T&D data analytics? Reach out to us today. We provide users with the data insights they need to transform an organization’s vision into reality.