Author: Rob K.

Most organisations appreciate the importance of data and analysis to effective, evidence-based decision making.

Every day, we discover new ways to collect and analyse data using entirely new data sources like images or audio files. This rapid evolution, combined with the rollout of artificial intelligence (AI)-based technologies, is making advanced analysis available to anyone who has a good (or bad) question to ask. So, what does this mean for public sector entities?

How data and analysis are evolving

The amount and variety of data available to entities is greater than it has ever been – but the sensitivity of this data is also greater.

For example, consider the different ways staff communicate with each other in the office.

Historically, this communication was limited to formal meeting notes, emails, or phone call logs. Today, your entity has access to data sources such as:

  • chat window interactions 
  • recordings of video calls and their associated audio transcripts
  • usage logs from meeting rooms
  • staff seating logs in hot desk environments.

This growth in data, combined with the emergence of AI-based tools that make it more accessible for analysis, makes it critical for entities to understand their responsibilities.


The Queensland Government published a guideline on the use of generative AI in August 2023. It provides key considerations and recommendations for entities and employees when using AI, and some great examples of what that looks like in practice. 

One of the key principles in this guideline is that everyone is responsible for understanding the classification of data (including classification levels like ‘sensitive’ and ‘protected’) and that employees should not share, input, or upload information into generative AI products and services that their entity has not approved for use.  

Entities should also consider the sensitivity of questions they ask AI, especially if they relate to work that is not in the public domain. Some tools can be configured to retain and reuse any information you put into them, including the very questions you ask. This is used to train the model but could also provide unintentional insight into strategic or operational decisions you are exploring.

The Queensland Government knows that generative AI has the potential to offer insights to support the delivery of better public services, but also acknowledges the importance of managing the data associated with it. To this end, the Queensland Government has published an Artificial intelligence governance policy and a Foundational artificial intelligence risk assessment framework. It is also in the process of rolling out its own AI-powered assistant, QChat.

What should entities consider? 

As new, exciting ways to collect and use data to generate insights emerge, entities should ensure they balance that excitement with respect for the data and the people it relates to.

This is important because new technologies make it easy for anyone in an organisation to perform analysis, not just those who work in areas that focus on, and have controls over, data governance. 

For example, an entity that has audio transcripts of staff video meetings should consider some key factors before using them in analysis. Did the staff involved know the transcripts were being created? Did they know they might be analysed? Will the results be aggregated to a level that ensures no individual can be identified?

An entity’s reason for using data in analysis is often to benefit those the data is based on. But entities should still consider whether these individuals would be happy with the data being used in this way.
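One simple way to apply the aggregation principle above is to suppress any group in a summary that falls below a minimum size, so a reader cannot infer anything about a specific individual. The sketch below is a minimal illustration of that idea; the team names and the threshold of 5 are assumptions for the example, not a prescribed rule.

```python
from collections import Counter

# Illustrative threshold (an assumption, not a standard): groups with
# fewer records than this are suppressed rather than reported.
MIN_GROUP_SIZE = 5

def aggregate_with_suppression(records, min_size=MIN_GROUP_SIZE):
    """Count records per group, replacing counts for small groups
    with a placeholder so individuals cannot be singled out."""
    counts = Counter(records)
    return {
        group: (count if count >= min_size else "<suppressed>")
        for group, count in counts.items()
    }

# Hypothetical data: which team each analysed transcript came from.
teams = ["Finance"] * 12 + ["Legal"] * 2 + ["IT"] * 8
print(aggregate_with_suppression(teams))
# The two Legal records would be reported as "<suppressed>".
```

The same pattern applies whether the summary is produced in a spreadsheet, a database query, or a script: decide the minimum reportable group size before the analysis, not after.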

With all this in mind, before collecting and sourcing data, or exploring using AI-based tools, entities should ensure they:

  • understand the sensitivity of the data. Entities should consider this both from an organisational perspective and the perspective of the subject or the data owner 
  • clearly define the goal of the analysis. The more focused entities are with analysis goals, the better they can understand and justify what data they need
  • determine what data is actually needed and what would merely be nice to have. Just because you can do something doesn't mean you should. For example, if an entity wants to understand the age profile of its workforce, it could source age ranges rather than specific dates of birth
  • understand who the data owner is. All data in an organisation will have a group or individual responsible for its collection and governance. These staff protect the data and help others to understand its quality and sensitivity 
  • are transparent with the data owner. Doing so can help prevent any data misuse, regardless of initial intentions
  • have clear plans for sourcing, storing, and disposing of data. Just like entities should only source the data they need, they should only keep it for the required retention period
  • understand who needs to see what and at which stage. The audience for the analysis usually does not need to see the raw data. Entities should work closely with their information technology staff to ensure data is secured at each stage of analysis
  • adhere to any relevant policies in relation to the use of AI-based technologies
  • comply with government recordkeeping and privacy requirements. 
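The age-profile example in the list above can be made concrete: derive an age band at the point of collection so the analysis never needs to hold exact dates of birth. The band edges below are illustrative assumptions chosen for the sketch.

```python
from datetime import date

def age_band(dob: date, today: date) -> str:
    """Convert a date of birth into a coarse age band.

    The analysis dataset only ever stores the band, not the date
    of birth itself. Band boundaries here are illustrative.
    """
    # Subtract 1 if this year's birthday hasn't happened yet.
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    if age < 25:
        return "under 25"
    if age < 40:
        return "25-39"
    if age < 55:
        return "40-54"
    return "55 and over"

print(age_band(date(1990, 6, 15), date(2024, 8, 1)))  # → "25-39"
```

Applying the transformation before the data reaches the analyst means the more sensitive source field never needs to leave the system that owns it.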

Remember, while discovering new insights using data and analytics can be fun and exciting, it should never be at the expense of our ethical values or moral compass. 

The subject of the data point should always be top of mind. Entities should ensure this is a key consideration when they perform analysis and use it to communicate or make decisions.
