At the GITEC 2023 Conference hosted in Annapolis, MD by the association ATARC, government CIOs shared insights into how hot technologies like AI, as well as less "sexy" technology priorities such as IT modernization and big data analytics, are helping to advance government technology initiatives. AI continues to get mainstream media attention, and specific AI and machine learning technologies such as large language models (LLMs), popularized by ChatGPT and Google Bard, are starting to show their value in government applications.
Benjamin Berry, Chief Information Officer, Bonneville Power Administration, U.S. Department of Energy, shared insights into how technology is helping advance his agency's efforts. The Bonneville Power Administration provides power infrastructure in the Pacific Northwest and is a key component of U.S. power infrastructure. As such, one of its key priorities is the modernization of its IT systems. This focus will keep the systems not only effective and efficient, but also more resilient to emerging cybersecurity and physical threats to power generation and transmission infrastructure. As explained by Mr. Berry, most of the agency's efforts are focused on enterprise modernization, moving from legacy systems to emerging cloud, on-premises, and hybrid systems.
Gundeep Ahluwalia, Chief Information Officer, Office of the Chief Information Officer, Deputy Assistant Secretary for Operations, U.S. Department of Labor, shared insights into the challenges of managing the sprawl of the agency and the various roles it plays. He explained that the Department of Labor (DOL) has more than 650 locations across the country, with varying issues around allocating capacity. There are many different inter-agency connection points, which pose major challenges for technology implementation and adoption.
As such, Mr. Ahluwalia shared that the agency is making big moves on digitization and modernization. He also explained that the Department of Labor has deployed more advanced technology, including the use of AI and machine learning in various systems. One implementation shared by Mr. Ahluwalia is the use of AI for autocoding of Occupational Safety and Health Administration (OSHA) reports, with over 97% of reporting data now being autocoded by AI systems. These implementations are allowing the agency to shift humans from low-value work to high-value work. He further shared that the agency is working on improving aspects of data quality and spending more time with other organizations to develop best practices in this area.
Guy Cavallo, Chief Information Officer, Office of the Chief Information Officer, U.S. Office of Personnel Management (OPM), shared additional insights into where and how OPM is making use of advanced technology. OPM developed a new strategic data plan and IT strategic plan to replace an existing plan released in 2014 that was never implemented. OPM has released an agency strategy and a data strategy, and is soon releasing a cloud-centric IT strategy. Mr. Cavallo related that OPM opened a 24/7 help desk to support not only remote workers but also staff in locations outside the DC region. OPM is also making use of AI and natural language processing (NLP) to assist with those human-centered tasks, including possible use of AI-generated content for some of them, a topic discussed at the GovFuture Forum event in DC in April. Mr. Cavallo said that the U.S. Department of Veterans Affairs (VA) AI chatbot is taking 4% of the load off call center volume, and he wants to do the same at OPM. Currently OPM faces challenges with writing AI-related job position descriptions, but it is using AI to help draft those descriptions.
Michael Anthony, Chief Information Officer, Office of the Managing Director, U.S. National Transportation Safety Board (NTSB), talked about his agency's needs and adoption of technology. While driving home the point that the NTSB is an independent agency and not part of the Department of Transportation (DOT), it does follow some guidance from DOT agencies, though it maintains its own strategy as well. NTSB is focusing on zero trust and identity and is just starting to exploit the cloud. At the GITEC event, Mr. Anthony didn't want to say too much about NTSB's current usage of AI, but related some of the possibilities. NTSB is struggling with human capital issues, and is focusing on using data to transform the "customer experience".
Dr. Gregg ‘Skip’ Bailey, Deputy Chief Information Officer, U.S. Bureau of the Census, U.S. Department of Commerce, shared with the GITEC audience his thoughts on the use of technology, including the bureau's advanced use of AI and NLP. Dr. Bailey talked about the challenge of maintaining 130 different surveys, not just the once-a-decade national census. This means that Census will be looking at data lake, cloud, and AI technologies to provide more NLP and analytics. Dr. Bailey related that the bureau is in the middle of building an enterprise data lake and dealing with the challenges of siloed data across the organization. Currently, Census is using NLP for analysis of data, and is looking at chat systems to ensure its data is seen as authoritative and authentic. He wants to make sure that data crawlers are reaching Census data and treating it as the "original and authentic" source that is so often quoted. Dr. Bailey mentioned that records management is a significant issue as well, and a key longer-term focus.
IT modernization, AI, big data analytics, cybersecurity and zero trust, and migration to the cloud continue to be hot topics of interest for the U.S. Federal Government. Through continued and ongoing conversations, agencies are able to learn from one another, employ best practices, and make the best use of taxpayer dollars while driving mission priorities.