A new Chief Data Officer at the Army Futures Command

Featured

Vol. 17 // 2021

Led by General John Murray, Army Futures Command (AFC) is a US Army organization that seeks to bring modernization and transformation to our warfighters. A command centered on exploring the potential of leading-edge technology for our armed forces, AFC brings together some of the greatest minds in the US military and technical communities. To understand how the leaders of AFC tackle these opportunities, Cognitive Times interviewed the command’s newly appointed Chief Data Officer, Colonel Matthew Benigni, PhD.

As a scholar, researcher, veteran, and data scientist, Col. Benigni gives us a glimpse into the thoughts, challenges, and opportunities of being a technical adviser for the US Department of Defense. He talks with us about his successes as the former Chief Data Scientist for the United States Special Operations Command (USSOCOM), his time at Carnegie Mellon, and what he looks forward to at AFC.

CT: In your former role as Chief Data Scientist for USSOCOM’s Global Analytics Platform, you developed machine-intelligence workflows for national security missions. Can you tell us more about the purpose of that project and its significance?

The charter of the Global Analytics Platform (GAP) is to make our supported command’s data accessible, useful, and insightful. We have prioritized our data flows and built a common architecture and set of data services that provide consistent capabilities across multiple endpoints spanning on-premises and multiple cloud environments. This data fabric integrates our command’s data holdings into both operations and intelligence decision making and is accelerating digital transformation. I think the compelling nature of the special operations mission and a focus on direct support helped both our Global Analytics Platform team and our supported command dramatically improve “organizational data literacy” over the past four years.

CT: The GAP placed heavy emphasis on operationalizing AI and ML techniques to make data insightful. How did you and the SOCOM team approach this mission? What were the core pillars of success?

Our successes start with command emphasis. We embrace the need to grow our organizational data literacy. Our commanders assert that data literacy is a professional responsibility for today’s warriors, whether technologist, operator, analyst, or logistician. The GAP’s iterative, agile development serves as an “upskilling” program as well as an innovation lab. We enable our warriors and technologists to learn enough of one another’s disciplines to deliver high-impact machine-intelligence workflows.

We use the physics of oil drilling as a metaphor to explain our transformation strategy in an understandable way. We assert that there is an organizational progression required to operationalize artificial intelligence and machine learning, similar to Rogati’s “The AI Hierarchy of Needs,” published on Medium in 2017. We compare extracting mission value from data to extracting oil from a well, and let depth be analogous to our command’s organizational data literacy, or the ability to deliver mission impact from data. In our metaphor we must “drill” through the phases of making our data accessible and usable before resourcing AI and deep learning into production data pipelines. We use the concepts of rotation, torque, and friction to shape resourcing and prioritization decisions.

Pillar 1: Drilling Must Start with Rotation

In our case a “rotation” is an iteration of an analytic pipeline development cycle that produces novel insight for an end user and solicits end-user feedback. In other words, it is an attempt to enable an operational effect for end users and solicit feedback. Ultimately our technologists need a functional understanding of our military and intelligence professionals’ needs to effectively select and implement appropriate methodologies and technical solutions. Our military and intelligence professionals need to become smart consumers of analytics. This shared understanding requires repetition. Today’s information management and business problems deliver near-term value with a convenient downstream benefit: we grow our data literacy by doing.

Pillar 2: Train for Torque

In our drilling illustration, torque is analogous to our analytics team’s ability to employ new methods and technologies for mission impact, and it is a function of the team’s relative mastery of operational understanding (operator- and analyst-driven), methodological breadth (data-scientist-driven), and DevOps practices (engineer-driven). We try to target best practices from industry and adopt them incrementally. We actually devote story points in our sprint planning to mastering specific fundamentals as if they were features.

Pillar 3: Resource to Reduce Friction. It Is More Expensive Than You Realize

Pillars 1 and 2 reinforce one another. Our ability to produce torque increases with rotation. Moreover, our ability to construct this type of problem-solving environment aids in recruiting and retention. We want problem junkies. For these reasons it is critical to identify and remove barriers to “rotation” and “torque.” We use friction to represent effort spent delivering mission value that does not result in increased operational understanding, methodological breadth, DevOps practices, or user-oriented design. This refers to things like technical debt, data cleaning, and infrastructure limitations. These pillars have helped us mature from our initial focus on applications and algorithms to delivering high-performing, scalable application development and data science platforms to our command.

CT: What best practices from industry and commercial contexts have you found to be the most translatable to military contexts?

First and foremost, our incorporation of the Scaled Agile Framework (SAFe) within the GAP and Scrum within our data science teams has helped us tremendously. Specifically, the ideas of limiting Work in Progress (WIP) and building in feedback loops have required deliberate effort from our government leadership (myself included). Limiting WIP is not how we grow up as leaders within the Army: we typically sign up for too much and then complete the important tasks. Our military leaders needed training to understand and observe the costs associated with overcommitting with respect to WIP.

We have also embraced many of the concepts from the InnerSource Commons community. One of the benefits of the DoD and IC culture is that our technologists want to share their work. We curate a shared enterprise code library within the GAP, which has forced us to embrace concepts like automated code review and documentation, dependency management, automated security scanning, and so on. Devoting effort toward code accessibility and reusability has driven quite a bit of professional growth within our team. Many of those fundamentals are likely common within industry and on mature open source projects. A good dose of humility and respect for industry best practices helps us endeavor to incorporate better practices over time. Scalability and reusability are the goals.

We also drew from Airbnb’s commitment to knowledge sharing as they grew their data science teams, and built a fairly simple, open content management system within our secure networks for data science tutorials. One of my favorite professors at Carnegie Mellon, Jim Herbsleb, claimed “writing is nature’s way of telling you how fuzzy your thinking is.” I cannot think of an example where the publication process has not caused me to improve my methods. Our commitment to publishing our findings in a forum accessible by colleagues helps with reusability, but also improves our work in the process. It grows “torque” to draw on our drilling metaphor.


CT: From your time at USSOCOM and Carnegie Mellon, you have prioritized research. What role does research and experimentation play in your position and why is it significant to you?

The fundamentals of defining a problem, stating a hypothesis, addressing bias and uncertainty, and communicating results are inherent to just about any data exploitation problem. Timelines, techniques, and rigor may vary between use cases and disciplines, but the fundamentals are consistent. One of the things we have done in the GAP is dramatically reduce the barriers to iterative feedback with respect to applied research projects. By developing a common data science platform and a team of practitioners able to provide direct support to deployed special operators, we now iterate with Federally Funded Research and Development Centers like MIT Lincoln Laboratory and MITRE at much faster rates. We are able to collaborate and share workflows in our unclassified data science environments, and our data science teams can test the workflows on classified data and provide feedback. It has taken us a while to be able to do that efficiently, but that feedback is invaluable to our research partners and comes at relatively low cost to our supported command. In fact, sometimes this type of experimentation can provide value in near real time.

CT: How has your time working on GAP impacted or informed your approach as the new Chief Data Officer at the Army Futures Command?

I think I have enough experience at this point to walk into any new job with humility. Most problems are far more complex than they appear at first glance, so I am prepared to do quite a bit of listening and asking “why” questions. I do not think you can deliver sound policy, infrastructure guidance, or analytics without a deep understanding of the organization’s value streams and practices.

I will however immediately look to implement industry best practices in terms of fundamentals. We will get a little better each sprint. The longer I am in this business, the more my life feels like a tech version of Gene Hackman in “Hoosiers.” I think it comes down to enabling and growing our people through sound fundamentals. The great news is that the data science community in the DoD, the IC, and industry has been extremely collegial in my experience. Nobody will spoon feed you, but if you are putting in the work and looking to grow, colleagues tend to be generous with their time.

CT: The Army is primarily about people. You’re trying to augment people with machine-aided capabilities, but ultimately it’s the people who matter. How is the Department approaching people differently to ensure that they understand the technology, can interact with it, and can leverage it to better accomplish the Army’s mission?

I think there is general consensus that tomorrow’s force requires greater technical depth. Through innovative training and engagement pipelines at the AFC Software Factory and the Artificial Intelligence Integration Center (AI2C), Army Futures Command is helping the Army grow the technological depth needed on tomorrow’s battlefield. AFC’s Software Factory provides a boot-camp-like experience that enables Army soldiers to learn software engineering and design by solving tractable, relevant Army problems through application development. Dr. Matty’s team at the AI2C at Carnegie Mellon is spearheading a pilot for the Army that consists of four major lines of effort: educating scholars, master’s- and PhD-level data scientists and engineers; training AI cloud technicians, a boot-camp-style pipeline for personnel who can build and maintain cloud applications; executive education, so that Army senior leaders understand AI and how to properly employ this workforce; and AI user education, MOOC-style training intended for the majority of the Army. Each of these lines of effort is at a different stage of maturity.

In parallel, the AI2C is developing the “primary weapon system” for this AI workforce: Coeus, a next-generation, community-focused AI and data science development environment that provides secure access to tools and data. All of these efforts utilize the scalable platforms being developed by the Army’s Enterprise Cloud Management Agency, led by Mr. Paul Puckett. Initial training pipelines are important, but figuring out how to then employ these new skills is critical as well. The Army is experimenting with new ways to manage this technical talent to ensure data literacy starts to permeate the force.

I often have people ask me, “Why are there so many efforts out there where DoD is building software?” I think we are building our technological depth within our military and civilian workforce. That technical depth is driving our modernization efforts, and I believe it will dramatically improve our ability to partner with industry.

About Colonel Matthew Benigni, PhD

Colonel Matthew Benigni currently serves as Army Futures Command’s Chief Data Officer and directs the Data and Decision Sciences Directorate (DDSD).

He spent the last four years as the Chief Data Scientist for the Joint Special Operations Command’s Global Analytics Platform, and his work advising forward-deployed data scientists on Tactical Data Exploitation Teams (TDETs) won the 2019 USSOCOM Lambertsen Award for Innovation in Support of Operations.

He was commissioned as an armor officer in the United States Army and serves as an Operations Research Systems Analyst. His assignments include commanding a company of tanks and infantry in Eastern Baghdad from 2004 to 2005, serving as an assistant professor in the Department of Mathematical Sciences at the United States Military Academy, and providing strategic planning support at the United States Special Operations Command, MacDill Air Force Base, FL, USA.

He has a B.S. in operations research from the United States Military Academy at West Point, NY, an M.S. in applied statistics from the Colorado School of Mines, and a doctorate in Societal Computing from Carnegie Mellon University’s School of Computer Science. His academic research in the area of online extremism and propaganda is highly cited, and he continues to focus on applied machine learning and data mining in support of current operations.