I recently had the privilege of reading a piece by Lisa Welchman about digital governance. In her article she states that "we need to do a better job of governing both the internet and web," and I wholeheartedly agree with her. She mentions Facebook and Twitter (now X) and the manner in which the two tech giants handle their data. Both have been subjected to several data breaches. In fact, 200 million X user records were exposed in a breach disclosed earlier this year, in which a hacker exploited an API vulnerability that had previously been reported through what was then Twitter's bug bounty program.
Sure, it's a social media website, but X can be connected to many aspects of a user's online presence. What is today an X data breach can tomorrow become a breach of your bank. Equifax, for example, suffered a massive breach within the past decade, one made worse by its lax security practices and by the way it handled the disclosure. Clearly, stronger measures need to be in place to keep sensitive data from leaking at the rate it currently does.
Reading further after finishing Welchman's piece, I found an entry in Volume 162 of the Journal of Business Research by Marvin Hanisch et al. In the abstract, they state: "…we propose a typology of analog, augmented, and automated governance modes, each associated with specific control, coordination, incentive, and trust mechanisms." The idea is not to rely on a single mode of governing to cover the full scope of a company's or organization's digital governance.
I find this typology particularly interesting because it suggests that while some components of digital governance do not need a human in the loop and can simply be handled programmatically, other parts require a human being to review a decision and determine whether it is appropriate. If it isn't, the proper response should be formulated and set into motion. Automation in this context could also mean a method baked into a company's day-to-day proceedings, such as a policy that employees interact with daily but rarely question, whereas an augmented or analog mode might involve the direct intervention of upper management.
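To make that split concrete, here is a minimal sketch of what an automated governance check with a human escalation path might look like. This is my own illustration, not anything from Welchman or Hanisch et al., and the rule names and thresholds are entirely hypothetical.

```python
from dataclasses import dataclass

# Hypothetical example: an automated policy check that either approves a
# data-sharing request outright or escalates it for human (augmented) review.

@dataclass
class DataRequest:
    requester: str
    record_count: int
    contains_pii: bool

# Assumed threshold: anything above this, or anything touching PII,
# falls outside the fully automated path.
AUTO_APPROVE_LIMIT = 1_000

def automated_review(request: DataRequest) -> str:
    """Return 'approved' for low-risk requests, 'escalate' otherwise."""
    if request.contains_pii or request.record_count > AUTO_APPROVE_LIMIT:
        # Augmented/analog mode: a person decides whether this is appropriate.
        return "escalate"
    # Automated mode: the policy itself makes the call, no human in the loop.
    return "approved"

if __name__ == "__main__":
    print(automated_review(DataRequest("analytics-team", 250, contains_pii=False)))      # approved
    print(automated_review(DataRequest("marketing-vendor", 50_000, contains_pii=True)))  # escalate
```

The point of the sketch is only that the automated rule handles the routine cases, while anything risky is handed to a human, which is roughly the division of labor the typology describes.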
In the final section of her piece, Welchman states: "We've seen, at scale, what happens when digital products and services are developed with no accountability to a strategy and to local and global policies." My final remarks on that are as follows: we need more accountability, inside and outside of companies, in the policies that govern the flow of data from one place to the next, whether that accountability is codified into law or built into the structure of every company that has any control over the data we give it. For a technology to still be relevant in a decade, it needs to be safe. For it to be safe, it needs to be appropriately governed.