Part 2: Ethics in AI

In attempting to come to grips with where ethics is being applied to AI, I have found through formal research that organizations in both the private and public sectors have created frameworks or operating principles for the ethical use of data and algorithms.

That is the good news.  The bad news is that these frameworks are rarely enforced or even used.  It appears as though lip service is being paid to the recommendations.

All of this got me thinking about the fundamentals of ethics, and about how and why they can be applied to AI and the cognitive enterprise in a way that actually works.

One of the challenges is that the field of ethics has been discussed, argued over and loosely defined for many years.  Ethical theory is not new, yet we are attempting to apply it to an area that did not exist until recently.

But that should not concern us.  Ethics has been applied and re-applied to all aspects of humanity over the years, and in recent decades to areas such as equality, abortion, class differences and hiring practices.  So AI will be nothing new to ethical theory.

The real challenge is defining which ethical principles make the most sense to apply so that people are protected.

One of the guiding principles I was trained in is equal consideration of interests.  In the next post I will delve into what that means for AI in greater detail.

Have a great day !

Part 1: A Journey into Practical Ethics and AI

Hello World,


I have not posted in a while, as I have been a little preoccupied with my day job.  I read daily and save countless articles on AI, and I watch the tide of opinion come in and go out on where AI is being used and how advanced, or how simplistic, it currently is.

But one thing rings true: ethical practices in AI need to be addressed.  I hear about some companies calling for regulation, or “laser” regulation, which sounds a bit concerning.

When I work with clients, data and its ethical use are always part of the discussion, because without clean and ethically sourced data the AI algorithm or digital platform will fail.  Once you get into the AI side of the discussion, you are into areas of bias, variance and transparency.

So from my perspective, ethical practices are fundamental to all aspects of implementing AI.  I have said this many times: regardless of industry or technology, you can reduce the discussion to a strategy around People, Process and Policy.

So with that three-part rubric in hand, I am going to set off on a journey, step by step, to look at how practical ethics can be applied to people, process and policy in the realm of AI technology.

Wish me luck and may the trade winds be at my back.

BTW the images may be a little random but they are mine and they just give you something nice to look at while thinking about applying ethical practices to your data and AI journey.

An Executive's Guide to Cognitive Computing, Part 1

What you need to know about Cognitive Computing Part 1

In the coming weeks I will lay out an explanation of what Cognitive Computing is and recommendations on how it can be used.  As always, I look forward to comments and thoughts.

Technology continues to develop and improve how we interact with systems and people.  Over the past decade there have been radical developments in how computers make sense of:

·      Text

·      Voice

·      Pictures

Given these capabilities, what will this mean for your agency or program?  The possibilities are endless when one considers the value of integrating text, voice and pictures into the decision-making process.   The purpose of this discussion is to investigate the “art of the possible” and to provide an overview of Cognitive Computing. We will conclude with a focus on approaches and recommended next steps.

This shift in technology provides the ability to automate knowledge work and decision making in a more meaningful way.  The challenge is that policy change must occur to reap the benefits of this new technology.  Cognitive Computing will face some challenges over the next few years, and it will evolve over time as it is used by public sector organizations.

Public sector organizations around the world have too much information to analyze. Traditional Information Technology (IT) systems cannot cope with the variety or volume.  Each department or program needs access to more sources of information. The more “dimensions” of information accessed (e.g. geo-spatial, social media, weather data), the better the outcome.

In today’s world, employees and programs are asking harder questions than traditional means can answer. For example, one government agency is trying to analyze the dynamics of the underground economy.  A state government is attempting to understand why municipalities dissolve or succeed.  It is important to apply critical thinking principles via cognitive computing and to eliminate the “noise” that exists in today’s information-driven society.  Organizations can now use machine learning and hypothesis testing to apply critical thinking to larger sources of all types of information.

The recommendations of this discussion will focus on ensuring a clear objective and a complete, well-defined domain of knowledge.  This ensures that cognitive systems will work successfully to solve complex business problems.

Thoughts?

Contextual Computing

Contextual Computing: unlocking the power of enterprise data (infographic)

Cloud Computing issues for the long haul

Governments are beginning to experiment with cloud computing, but unlike other industries they must deal with more regulations and policies. Governments of all sizes must be very stringent in their requirements for cloud providers; in particular, in-country or in-territory hosting is a primary security concern. Additionally, most governments want their employees at the same site as the cloud infrastructure to ensure that compliance and policies are being enforced. These are some of the short-term issues that must be addressed when using cloud for government purposes.

A few days ago, I read a blog that opined that as cloud computing becomes a commodity, all the major players (IBM included) will back out of the market. I disagree. Since IBM will provide many software services via the cloud that cannot be commoditized, I suspect that no such back-out will occur. Take for example Enterprise Content Management (ECM). One of the premises behind proper ECM is that information (documents, records, images, e-mail, etc.) must be organized in a way that ensures “find-ability” and “retention.” If done correctly through file plans, classification and taxonomy, cloud-based ECM can help governments.

Furthermore, for government cloud use, a high degree of security and archiving practice must be implemented (e.g. DoD 5015, Chapters 2 and 4). In order to support a government department over time, Freedom of Information, document classification and declassification, and records management practices spanning up to 500 years all need a robust and dynamic ECM infrastructure. While the cloud services on a tablet or computer that back up your photos and email may become a commodity, the SLAs and information governance practices in government make this the least commoditized cloud environment. Cloud computing has only just gotten started, and I encourage you to take a look at what IBM ECM is doing with our IBM Content Navigator UI and cloud computing for the enterprise.

The importance of being earnest: with e-mail

I see it almost weekly now that some investigation or inquiry into a scandal or political wrongdoing relies more and more on email as evidence.

Either the emails can’t be found (for obvious reasons) or, when they are found, they illuminate a further chain of evidence that the guilty party can’t hide.  All in all, it points to the importance of public sector organizations properly managing and storing email in accordance with legislative guidelines and record-keeping principles.

In the past, records management was viewed as a dull and lackluster practice of storing physical documents in the basement of a department.  However, in today’s world it only takes one leaked email from a politician to highlight records management’s importance.

Email, like so many new forms of information (instant messages, images, voice mail, social media messages, etc.), has mostly been overlooked as a formal record.  As I work with government departments around the world, I have noticed that managing email as a record is becoming more and more prevalent.  The hard part is managing it well.   Backup tapes or copies will not cut it; proper classification and metadata are needed, along with a robust governance strategy.  Information Lifecycle Governance (ILG) becomes the means to ensure that the risk and security of information in email are managed effectively and efficiently.
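To make the "hard part" concrete, here is a minimal sketch of what declaring an email as a record against a file plan might look like. Everything here is hypothetical: the file plan categories, retention periods and keyword rules are invented for illustration, and a real ILG deployment would use a proper classification engine rather than keyword matching.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical file plan: category -> retention period in years.
# Real categories and retention rules vary by jurisdiction and legislation.
FILE_PLAN = {
    "procurement": 7,
    "personnel": 75,
    "general-correspondence": 2,
}

# Naive keyword rules standing in for a real classification engine.
KEYWORD_RULES = {
    "procurement": ["contract", "tender", "invoice"],
    "personnel": ["hiring", "salary", "leave request"],
}

@dataclass
class EmailRecord:
    subject: str
    sender: str
    received: date
    category: str
    retention_years: int

def declare_as_record(subject, sender, received):
    """Classify an email against the file plan and declare it as a record."""
    text = subject.lower()
    category = "general-correspondence"  # default bucket
    for cat, keywords in KEYWORD_RULES.items():
        if any(kw in text for kw in keywords):
            category = cat
            break
    return EmailRecord(subject, sender, received, category, FILE_PLAN[category])

record = declare_as_record("Invoice for Q3 contract", "vendor@example.com", date(2015, 9, 1))
print(record.category, record.retention_years)  # prints: procurement 7
```

The point of the sketch is that every email gets a category and a retention period at declaration time, rather than sitting unclassified on a backup tape.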

ILG then begs a few questions.  Who will own the management of email at the end of the day? Who will communicate and enforce the governance rules to all users? What will the file plan look like?  I have worked with many organizations on how they can effectively incorporate email into their overall governance and information management strategy.

The federal team of IBM ECM worked on a pilot project with a US military agency on automatically classifying email as a record.  I would encourage you to read the whitepaper on this project to see some interesting and eye-opening capabilities in automating records declaration and management, such as storage savings and efficiency gains.

Can new ECM technology ease the complexity of document declassification?


Hidden complexity in document declassification 

The explosion in digital records has made declassification a challenge, but recent document-leak events, including WikiLeaks, have heightened the pressure to ensure that all government documents and records being considered for declassification are managed securely prior to release. Typically, only the agency that creates classified information can declassify it. If an agency produces a record containing its own information as well as information from one or more other agencies, it must not only review its own information for declassification but also must refer the record to each agency that “owns” the additional classified information in question. A final declassification determination on information in the record is made only after all agencies have rendered decisions on their respective equities. 
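The referral rule described above, that a final determination is made only after every agency owning equities in a record has rendered a decision, can be sketched as a small function. The agency names and decision values below are illustrative only, not drawn from any real process.

```python
# Model of the multi-agency referral rule: a record containing several
# agencies' equities is declassified only after every owning agency has
# rendered a decision, and only if all of them approve release.

def referral_status(decisions):
    """decisions maps agency name -> 'approve', 'deny', or None (pending)."""
    if any(d is None for d in decisions.values()):
        return "pending-referrals"
    if all(d == "approve" for d in decisions.values()):
        return "declassify"
    return "remains-classified"

record = {"State": "approve", "Energy": None, "Defense": "approve"}
print(referral_status(record))  # prints: pending-referrals
```

Even in this toy form, the productivity limitation is visible: one pending referral holds up the whole record.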


I believe this structural process has inherent productivity limitations: even if an agency has only minimal content in a record, it must still review and declassify its portion. Agencies must alleviate these complexities to improve the efficiency and accuracy of declassification, and collaboration tools integrated with ECM systems would help.


I am seeing more ECM technology being deployed that addresses these issues by not only managing the classification, organization and retention of information but also automating delegation and task management. This consolidated, end-to-end approach helps ensure that all aspects of document declassification occur in accordance with legislation and in the time frame required.

Plus, the fact that digital content is growing not only in volume but also in “type” means that government ECM systems need to automate more and, as I have said in the past, integrate policies into the underlying process flows in order to be truly efficient.


Thoughts?


The “Third Platform” and ECM for Government

Government agencies are adopting a range of new technologies—from mobile and cloud computing to big data analytics—in an effort to boost worker productivity, improve the quality of services they provide to citizens and reduce costs. This collection of new technologies, called the “third platform” by IDC, reflects a significant shift in IT from the first two platforms of mainframes and personal computers.

There’s little doubt that the third platform has the potential to provide significant benefits for governments and their citizens. But for many government IT groups, supporting these technologies will require a careful reassessment of enterprise content management (ECM) strategies and practices. Here are some things to consider when prepping for ECM use on the third platform.

Mobile computing
By enabling mobile computing and supporting the bring-your-own-device (BYOD) trend (sometimes called “bring-your-own-disaster”), government agencies open new possibilities for remote work and anytime, anywhere access to information—while also cutting the capital costs of buying PCs. A court system could allow judges to access and mark up cloud-based case documents from a personal tablet at home, and then pick up their work on the same documents at the office the next day. Citizens could access up-to-the-minute public safety information from their smartphones.

To support these and other mobile computing scenarios, though, IT groups must find ways to manage the lifecycle of information across a wide variety of sources, and deliver trusted, up-to-date information to users’ preferred mobile devices. IT groups also need policies to help them maintain privacy, security and compliance while supporting greater user mobility. All of these practices must be coordinated in a data and content governance strategy, which, as I talk to governments worldwide, I find is not being done.

Cloud computing
Whether on-premise, off-premise, private or public, governments are assessing or planning on using cloud computing to host information and applications. This approach does resolve some governance issues with interoperability, because with mobile computing in the cloud, it’s the data—not the device—that is king.

As public-sector organizations move to the cloud, they acquire the ability to consolidate metadata or classification requirements, which reduces the chance of duplication or loss of information. However, if proper metadata, classification or file plans are not enforced in the cloud, then departments or users risk losing information. A badly planned cloud deployment in content management will turn out to be just another shared drive that no one has control over.

Finally, if government departments use public cloud infrastructures, who will ensure the enforcement of policy and compliance requirements for records management or content management?
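One way to picture that enforcement is a simple ingest gate that rejects content missing required metadata before it reaches the repository. The field names here are hypothetical, not taken from any particular ECM product.

```python
# Hypothetical ingest gate: content without the required metadata is
# rejected before it lands in the cloud repository, so the deployment
# does not degrade into an uncontrolled shared drive.
REQUIRED_FIELDS = {"title", "department", "classification", "file_plan_id"}

def validate_for_ingest(doc_metadata):
    """Return the sorted list of missing required fields; empty means ingest may proceed."""
    return sorted(REQUIRED_FIELDS - doc_metadata.keys())

doc = {"title": "Road maintenance tender", "department": "Public Works"}
missing = validate_for_ingest(doc)
if missing:
    print("rejected, missing fields:", missing)
```

The design choice worth noting is that the rule is enforced at ingest, centrally, rather than relying on each department to classify content after the fact.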

Big data analytics
To capitalize on the full potential of big data, government agencies need new ways to efficiently extract and analyze data from unstructured sources. Up to 80 percent of information exists in unstructured form within case files, fraud investigation notes, historical contracts, social media feeds and other sources. Natural-language processing technologies and classification engines help agencies extract valuable information from those sources, which they can then feed into analytics systems to substantially enhance the value of analyses.
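As a toy illustration of that extraction step, the sketch below pulls dates and dollar amounts out of free-text case notes with regular expressions so they could feed an analytics system. A real deployment would use a natural-language-processing pipeline and classification engine; the patterns and note text are invented.

```python
import re

# Toy stand-in for an NLP extraction step: pull dates and dollar amounts
# out of free-text case notes so they can feed an analytics system.
DATE_RE = re.compile(r"\b\d{4}-\d{2}-\d{2}\b")   # ISO-style dates
AMOUNT_RE = re.compile(r"\$\d[\d,]*")            # dollar amounts like $1,200

def extract_features(note):
    """Return the structured fields found in one unstructured note."""
    return {
        "dates": DATE_RE.findall(note),
        "amounts": AMOUNT_RE.findall(note),
    }

note = "Claimant visited on 2014-03-02 and requested $1,200 in assistance."
print(extract_features(note))
```

Once notes are reduced to structured fields like these, they can be aggregated and analyzed alongside the structured data agencies already hold.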

Analyzing unstructured data from social media can help intelligence agencies identify trends that might signal future hate crimes or terrorist acts. Caseworkers can draw on information previously buried in handwritten notes to make better-informed and more dynamic decisions.

The third platform of computing holds a lot of promise for improving productivity, service quality and efficiency. Better ways to capture, analyze, manage and govern content—supported by ECM strategies and best practices—will enable government agencies and their citizens to take full advantage of what these new technologies can offer. But don’t underestimate the importance of a strong governance strategy and plan, as well as a solid information lifecycle infrastructure.

What this means for ECM platforms and technology is that greater interoperability and functionality (text analytics, storage visualization, smart archiving, etc.) will be needed to support the third platform.

Bottom line: these new computing platforms and use cases create different modes of delivering service and value, which again means that policy and process in a government program need tighter alignment.
