Smart, fast, unobtrusive: capturing the right data with the right tools to enable Defence strategy, capability and operations

Global events such as COVID-19, and the rapid development of technologies such as artificial intelligence and high-speed communications, are driving greater uptake of digital technology across many asset-intensive industries. This is raising awareness of the value of the data that these technologies create and provide access to.

For example, in the Energy sector, data is becoming the key to effectively managing its complex networks and creating better visibility, control and coordination across assets. In Telecommunications, robust data on asset condition, utilisation and costs is helping to improve capacity utilisation and optimise asset maintenance.

For Defence, the ultimate vision for data is to help deliver the sector’s strategic goals: realising a sustainable and safe Estate, improving the efficient use of investment capital and industry resources, and strengthening industry capacity and capability. However, raw data has little value on its own.

In an increasingly digitally enabled operating environment, robust and timely access to data on the condition, operating performance and cost of assets is essential for effective day-to-day operational continuity, as well as for informing robust and defensible strategic decisions across the asset lifecycle.

There is a wide array of technologies that can enable more efficient and accurate monitoring of asset condition, usage, performance, and costs. This offers the tantalising prospect of giving managers and decision makers the capability and agility to move, adapt and transform organisations more quickly than ever before.

Data must be provided with context (in technical terms, by leveraging metadata and appropriate data models) and connected to the right technology, people and processes to unlock its value, whether by extracting insights to inform better decision making, or by using data to drive process innovation and automation.
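
As a simple illustration of what ‘context’ means in practice, the sketch below (in Python, with entirely hypothetical asset names, fields and values) shows the difference between a bare sensor reading and the same reading enriched with metadata that places it against a specific asset, location, time and source.

    from datetime import datetime, timezone

    # Illustrative only: a raw reading means very little without context.
    raw_reading = 7.3

    # The same reading, enriched with metadata drawn from a simple asset data model.
    # Asset identity, location, units and provenance below are all assumptions.
    contextualised_reading = {
        "value": 7.3,
        "unit": "mm/s",                       # vibration velocity
        "measurement": "pump_vibration_rms",
        "asset_id": "PUMP-0042",              # hypothetical asset register ID
        "site": "Base A - Water Treatment",   # hypothetical location
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "source": "handheld_vibration_meter",
        "quality_flag": "validated",
    }

    # With context attached, downstream systems can decide whether 7.3 mm/s is
    # normal or a warning sign for this particular asset class.
    print(contextualised_reading)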

Within the Defence sector, the journey to make these possibilities a reality has started. There is a rapidly growing body of information being captured through multiple channels: data on infrastructure, buildings, people and processes, as well as data from vehicles, soldiers, weapons systems and machinery. Data also flows from the Defence supply chain, such as information from inspections, during design and construction of facilities and equipment, and from the provision of contracted services.

To capture data from these many and varied sources, an array of new tools and technologies is being used. At the prosaic end of the spectrum are applications that run on mobile devices such as tablets, which facilitate more efficient capture of data by personnel on site, such as Aurecon’s Field Force application for condition assessments. ‘Go anywhere’ geospatial tools such as LiDAR and Simultaneous Localisation and Mapping (SLAM) are being embraced for their ability to capture data remotely, and there is increasing deployment of autonomous data capture systems using drones, satellites and IoT sensors.

The processing and interpretation of this data is increasingly being facilitated by advanced analytical technologies such as artificial intelligence and machine learning. It is now possible to automatically identify and assess many types of faults, defects and anomalies using these techniques: for example, detecting cracks in images or video footage of building facades and roads, or detecting and classifying landslides from satellite imagery.
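
As a rough illustration of how such detection works, the following sketch defines a small image classifier of the kind that could be trained to label photographs as ‘crack’ or ‘no crack’. It uses PyTorch, the architecture and names are illustrative assumptions, and a real system would need a large set of labelled imagery and a proper training pipeline before its outputs meant anything.

    import torch
    from torch import nn

    # A minimal, untrained sketch of a binary image classifier ("crack" vs "no crack").
    # In practice a model like this would be trained on labelled photographs of
    # facades or road surfaces; the architecture here is illustrative only.
    class CrackClassifier(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.head = nn.Sequential(
                nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 2)
            )

        def forward(self, x):
            return self.head(self.features(x))

    model = CrackClassifier().eval()

    # Stand-in for a 224x224 RGB image tile captured by a drone or site camera.
    image_batch = torch.rand(1, 3, 224, 224)

    with torch.no_grad():
        probabilities = torch.softmax(model(image_batch), dim=1)

    print("P(no crack), P(crack):", probabilities.squeeze().tolist())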

These technologies and techniques can facilitate the efficient capture of data, contribute to building valuable databases, and enable the provision of rich and relevant information to decision makers in a more automated way, leading to better, more timely decisions.

But with such an array of technologies at our disposal, how do we leverage these in the best possible way? How do we make sure we’re using these techniques to capture not just any data, but the right data, to create real value? It all starts with a clear view of the end goal, focusing on people and purpose first to understand what data is required now and in the future. This provides a solid foundation for making decisions on what hardware and software need to be deployed.

The gap between data and value

Handling the large amounts of data currently being collected across the Defence sector, and deriving benefit from its capture, is a complex challenge – especially for one of the largest landholders in Australia, with around 2.6 million hectares, 500 properties, 72 major bases, training areas, ranges, research facilities and office accommodation, supporting a combined workforce of around 100,000.

With such a dispersed, diverse and massive Estate, and the equally diverse and massive proliferation of data, the current reality across parts of Defence is a disconnect between the data being captured and the ultimate benefits it can bring – now and into the future. This disconnect plays out in many ways, including:

  • Disrupted operations
  • Failing assets resulting from acting on the wrong, or out-of-date, data
  • A constrained workforce focused on processing/compliance activities instead of decision making and adding value
  • Unnecessary cost blow-outs from slow manual data capture and analysis
  • Reinventing the wheel with re-collection and data remediation

As increasingly large amounts of data are collected, further challenges arise around capturing the right data, providing the right context for that data, and getting it to the right people at the right time. This requires well designed data governance, management and curation processes, and effective data storage, analysis and reporting capabilities.

Deciding which data to discard is also challenging; it is easy to fall into the trap of keeping everything ‘because it might be useful down the track’, creating a huge data swamp in which the right data is difficult to find.

Transforming challenges into opportunity: the must-haves

Crucially important to overcoming these challenges, and maximising opportunities, is ensuring that the fundamentals of smart, fast and unobtrusive data capture and management are in place. A culture of innovation and continuous improvement (good to great), one which empowers people and balances the need for technical accuracy with the bravery to experiment, sets the tone from the top.

Mastery in choosing the right capture tools, robust policies and processes for handling big data, frameworks for deciding what data to keep and for how long, and ways to guarantee data quality (accuracy, validity, integrity, completeness, timeliness, etc.) are fundamental to success.
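
To make these data quality dimensions more concrete, the sketch below shows how simple, rule-based checks for completeness, validity and timeliness might be expressed in code. The field names, scales and thresholds are assumptions for illustration, not a description of any existing Defence system.

    from datetime import datetime, timedelta, timezone

    # Hypothetical rules for an asset condition assessment record.
    REQUIRED_FIELDS = {"asset_id", "condition_score", "inspected_at"}
    MAX_AGE = timedelta(days=365)      # data older than a year is flagged as stale
    VALID_SCORES = range(1, 6)         # condition scored 1 (very poor) to 5 (very good)

    def quality_issues(record: dict) -> list[str]:
        issues = []
        # Completeness: every required field must be present and populated.
        present = {k for k, v in record.items() if v not in (None, "")}
        missing = REQUIRED_FIELDS - present
        if missing:
            issues.append(f"missing fields: {sorted(missing)}")
        # Validity: condition score must fall within the agreed scale.
        if record.get("condition_score") not in VALID_SCORES:
            issues.append("condition_score outside 1-5 scale")
        # Timeliness: flag inspections older than the agreed maximum age.
        inspected_at = record.get("inspected_at")
        if inspected_at and datetime.now(timezone.utc) - inspected_at > MAX_AGE:
            issues.append("inspection older than 12 months")
        return issues

    record = {"asset_id": "BLDG-17", "condition_score": 4,
              "inspected_at": datetime(2020, 3, 1, tzinfo=timezone.utc)}
    print(quality_issues(record))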

Must-haves include defining and assigning critical data management roles across the organisation (such as data owners and data stewards), and providing access to specialist resources such as data engineers, data architects and geospatial data experts to develop capabilities, processes and systems to capture and manage data more effectively, and specifically to:

Create a forward-looking data strategy

Create a strategy which considers where the blockages are today (e.g. inadequate data governance processes, data management capabilities, data architecture and enabling technology), but also looks at how these will need to evolve over the next 5-10 years. Start by understanding what you are trying to do with the data, work back from that, and then work forward from existing technological enablers.

For example, predictive maintenance of certain assets might be an important future goal. This will require the accumulation of enough historical data by asset class from relevant sensors and inspection and maintenance records to train a machine learning model to predict future failure based on live sensor data. The type, frequency and volume of data required for training the future machine learning models will determine what sensors need to be deployed today.
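
A minimal sketch of this idea is shown below, using synthetic data in place of real sensor histories: features such as vibration, temperature and hours since overhaul are used to train a classifier that flags assets likely to fail within a prediction window. The features, thresholds and model choice are illustrative assumptions only.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import classification_report

    # Synthetic stand-in for historical sensor and maintenance records.
    rng = np.random.default_rng(42)
    n = 2000
    X = np.column_stack([
        rng.normal(4.0, 1.5, n),    # mean vibration (mm/s)
        rng.normal(60.0, 8.0, n),   # operating temperature (deg C)
        rng.integers(0, 5000, n),   # hours since last overhaul
    ])
    # Synthetic label: assets running hot, vibrating hard and long overdue for
    # overhaul are more likely to fail within the prediction window.
    risk = 0.3 * X[:, 0] + 0.05 * X[:, 1] + 0.0004 * X[:, 2]
    y = (risk + rng.normal(0, 0.5, n) > 5.5).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
    model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
    print(classification_report(y_test, model.predict(X_test)))

    # In operation, the same model would score live sensor data so maintenance
    # can be scheduled before failure rather than after it.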

Identify what data is necessary to support better decision making on assets and operations

Start with clearly identifying the end use for the data – who will use it, how will they use it, what decisions will it inform? The answers to these questions will shape what data should be captured and will avoid the pitfall of capturing everything ‘just in case’.

Establish robust data governance policies and processes

Establish the ‘rules, roles and rituals’ that will ensure data quality and availability, and minimise risks around data security, privacy and confidentiality.
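
One way to make such rules and roles tangible is to capture them as configuration that systems can read, rather than leaving them solely in policy documents. The sketch below is purely illustrative; the dataset names, role titles, classifications and retention periods are assumptions.

    # Hypothetical governance register: who owns and stewards each dataset,
    # how it is classified, how long it is kept, and which checks apply.
    GOVERNANCE_REGISTER = {
        "asset_condition_assessments": {
            "data_owner": "Director Estate Planning",     # accountable for the data
            "data_steward": "Regional Asset Manager",     # manages day-to-day quality
            "classification": "OFFICIAL: Sensitive",
            "retention_years": 7,
            "quality_checks": ["completeness", "validity", "timeliness"],
        },
        "building_sensor_telemetry": {
            "data_owner": "Director Smart Infrastructure",
            "data_steward": "Facilities Data Analyst",
            "classification": "OFFICIAL",
            "retention_years": 2,
            "quality_checks": ["timeliness", "integrity"],
        },
    }

    def policy_for(dataset: str) -> dict:
        """Look up the governance policy that applies to a dataset."""
        return GOVERNANCE_REGISTER[dataset]

    print(policy_for("asset_condition_assessments")["retention_years"])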

Enable contextualisation and shared understanding of asset condition data by creating a virtual, real time view of asset conditions from diverse information sources

This asset information often has a spatial and temporal dimension, and there may also be clearly defined relationships between assets (for example, hierarchical relationships in which several related assets comprise a system, and functional relationships, which define dependencies between different assets). Capturing this metadata allows the asset data to be placed in space, in time, and in a relational context, in a way that accurately reflects important properties and behaviours in the physical world.
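
The sketch below illustrates, with entirely hypothetical assets, how these spatial, temporal, hierarchical and functional dimensions might be captured in a simple data model so that questions about dependencies can be answered directly from the data.

    from dataclasses import dataclass, field

    @dataclass
    class Asset:
        asset_id: str
        location: tuple                # (latitude, longitude): the spatial dimension
        last_inspected: str            # ISO date: the temporal dimension
        parent_system: str = ""        # hierarchical relationship (part of a system)
        depends_on: list = field(default_factory=list)   # functional dependencies

    # Hypothetical assets on a hypothetical base.
    assets = {
        "GEN-01": Asset("GEN-01", (-12.46, 130.84), "2023-08-14", "BASE-POWER"),
        "FUEL-01": Asset("FUEL-01", (-12.46, 130.85), "2023-05-02", "BASE-POWER"),
        "COMMS-TOWER": Asset("COMMS-TOWER", (-12.45, 130.83), "2024-01-20",
                             "BASE-COMMS", depends_on=["GEN-01"]),
    }
    assets["GEN-01"].depends_on.append("FUEL-01")

    # With relationships captured, a question like "what is affected if GEN-01
    # fails?" can be answered directly from the data.
    impacted = [a.asset_id for a in assets.values() if "GEN-01" in a.depends_on]
    print("Assets dependent on GEN-01:", impacted)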

In effect, capturing data in this way provides the basis for building a digital twin that can evolve over time into an increasingly rich and realistic representation of the real world. Relevant parts of this digital twin can be shared with operators and users of assets, maintenance teams and contractors as required. This makes it easier to understand the context, relevance, and relationships between different data sets, and facilitates a shared understanding of who needs what data and for what purpose.

It also highlights any important gaps in data, or issues with data quality, enabling the identification and prioritisation of initiatives to improve data acquisition and management. Finally, a digital twin provides the ability to perform simulations and virtual experiments to inform decision making in the real world.

Leverage existing data and platforms

Extract insights from current data resources to deliver immediate value while building the capability to enhance this data resource over time. We are currently working with Defence's Estate and Infrastructure Group (E&IG) to help identify what insights can be derived and decisions informed through the analysis of existing data held within the Garrison Estate Management System (GEMS).

Ensure new platforms dovetail into Defence systems

Take advantage of new capabilities provided by modern software and hardware solutions, but with a clear view of how this will interoperate with key legacy systems, and how it will align with current and future organisational capabilities and user needs. This requires an accurate understanding of who is using what information, how they are accessing this information today, what systems this relies on and how this may change in the future.

Provide great analytical capabilities

Bridge the analytics gap by building on Defence’s existing skills with experienced data engineers, data scientists and geospatial data specialists.

Understand the technology landscape

Understand how technology is maturing, to inform decisions on technology adoption, capability development and future-proofing. Digital enablement involves alignment of technology across three different layers:

  • the physical layer, or the hardware that interacts with the physical world such as autonomous data capture (e.g. drones, robots, satellites, sensors and IoT networks) or augmenting data capture by humans (e.g. using apps on mobile devices, cameras or handheld laser scanners)
  • the digital layer, or software-mediated processes that control the flow and manipulation of data, including disruptive technologies such as cloud platforms, edge processing and machine learning
  • the decision layer, where human/computer interfaces and data visualisation can assist staff, contractors, and management to make better operational and strategic decisions. A current example might be a dashboard or application that provides real-time information and alerts on asset performance (a minimal alerting sketch follows this list). Soon, augmented and virtual reality will play an increasingly important role in certain use cases, such as field and remote inspections.
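
As a minimal illustration of that decision layer, the sketch below turns incoming telemetry into alerts by checking readings against agreed thresholds; the metric names, ranges and values are hypothetical.

    # Hypothetical acceptable operating ranges for two building metrics.
    THRESHOLDS = {
        "hvac_supply_temp_c": (16.0, 26.0),
        "water_tank_level_pct": (20.0, 100.0),
    }

    def evaluate(readings: dict) -> list[str]:
        """Return alert messages for any reading outside its agreed range."""
        alerts = []
        for metric, value in readings.items():
            low, high = THRESHOLDS.get(metric, (float("-inf"), float("inf")))
            if not low <= value <= high:
                alerts.append(f"ALERT: {metric} = {value} outside range {low}-{high}")
        return alerts

    # Stand-in for a message arriving from a building management system.
    latest = {"hvac_supply_temp_c": 29.4, "water_tank_level_pct": 12.0}
    for alert in evaluate(latest):
        print(alert)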

The data Defence needs, without disruption

Data is an essential backbone for making better asset decisions at both a strategic and tactical level, quickly and with agility, but doing so requires a more strategic approach to data capture and management. With the current pace of data acquisition across the Defence sector showing no signs of slowing down, there is an immediate and critical need to design and embed processes and systems to capture, manage and analyse (the right) data more effectively.

Adopting new technologies that enable smart, fast and unobtrusive data capture will progressively bridge the gaps between the cost of collecting and processing data and the value that can be delivered by using this data.

There are opportunities to accelerate and automate data collection processes in many areas, thereby greatly reducing costs and, in many cases, significantly improving data quality.

Leveraging these technologies and methods will facilitate better decision making and better investment choices in the near term, and will help to build an increasingly valuable data resource that will support the future deployment of game-changing technologies such as artificial intelligence, automation and digital twins.

Ultimately, it will be possible for Defence to efficiently collect and process most of the information required to enable better decisions and expanded capabilities, and to ‘invisibly’ – automatically and continuously – collect relevant data without any impact on day-to-day operations.


About the Authors

Adam Rankin has experience in project and programme management of multidisciplinary, building and infrastructure projects in Australia and overseas. He has spent 13 years with the Australian Army in operational and project delivery roles, where he developed strong project management skills, with particular expertise in the planning phases of major projects and programmes.

Eric Louw is Director, Data, Risk and Analytics at Aurecon. He has twenty years' experience with leading management consulting firms and as an independent strategy consultant. He is the co-author of three business books, as well as numerous articles and academic papers.

Rebecca Strang is Aurecon's Geospatial and Land Infrastructure Capability Leader. She has twenty years' experience in the geospatial sector, with a background in land surveying and experience leading digital transformation across Aurecon's NZ business. She is currently focused on exploring digital twin use cases with clients (the benefit to both organisations and communities) and, more broadly, improving productivity in construction through capability building and a culture of innovation.
