Challenges and Opportunities for Cognitive Computing in the Public Sector

Challenges to Cognitive Computing in the Public Sector

The challenge of improving or innovating government programs today is that we have a broad array of information and process automation to coordinate. Another major challenge of a data-driven world is that information can be wrong, out of date, or inaccessible. Cognitive computing, with its ability to apply algorithmic programming, can identify advanced patterns across much larger collections of data sets, which lets us reduce the “noise” associated with decisions and program outcomes.

There is also a need for evidence-based decision making that follows a prescribed methodology, as well as a need to analyze ever larger bodies of knowledge and information. Traditional rules- and equation-based programming cannot manage this or interact with a human in natural language. Cognitive computing interacts with a government worker in natural language, and it can learn, enhancing the algorithms needed to find and test hypotheses.

Government must ensure information is secured and managed effectively. Moving to cloud-based computing, or sharing data sets across government, raises the question of how far data or information may move from where it was created. Information provenance and governance practices must be in place, and private cloud or platform-as-a-service (PaaS) cognitive computing service catalogs are needed to ensure the data stays within the boundaries of how it is to be governed and managed. Data residency is a further issue with cloud services; however, most major cloud providers now operate data centres within the country, which offsets many data residency concerns.

Recommendations:

Most governments around the world today use a shared services model to support core ICT and enterprise applications. Governments are now looking at cloud brokerage services managed within government, which addresses data gravity issues, and, given the nature of the API economy, PaaS (Platform as a Service) offerings are being investigated and tested. We therefore see central shared services agencies acting as the agents of change, and we will look to them to deploy cognitive computing PaaS as a service catalog that other government agencies and projects can leverage, ensuring information is secured and protected according to its type.

Opportunity to Innovate in Government Programs

Because cognitive computing can identify patterns in large sets of information at high speed, the opportunities in government are broad. Since all information sources can be analyzed and combined (databases, text, and so on), an individual gets a more complete picture on which to base decisions.

Any area within a government program that houses a large set of information relevant to a specific domain (benefits, policy, regulations, and so on) would benefit from cognitive computing, since this information can be analyzed and added to a corpus of knowledge that the machine learning algorithms can access and analyze across dimensions such as time or relevance.

Six forces will impact the future evolution of cognitive computing in the public sector.

Each has its own issues and challenges for the adoption of this technology.

Society

  • Tremendous demand for more intelligent machines and access through mobile devices can facilitate familiarity and comfort with technology
  • Fears of privacy breaches and machines taking human jobs could be a deterrent

Perception

  • Perceptions and expectations must be well managed
  • Unrealistic perceptions of risk and inflated expectations could lead to a third Artificial Intelligence (AI) winter

Policy

  • Wider adoption will require modifying existing policies (e.g., data sharing) and creating new ones (e.g., decision traceability)
  • Fear, uncertainty and doubt may be addressed by new policies (e.g., data security & privacy)

Technology

  • Advanced, intelligent devices will enable a greater understanding of entity context and contribute to the robustness of available information corpora
  • Greater scalability needs drive new architectures and paradigms

Information

  • The variety and scalability capabilities of future systems will advance rapidly to cope with the growing information exhaust
  • Information explosion could advance evolution and adoption rates

Skills

  • Cognitive computing demands unique skills, such as natural language processing and machine learning
  • Greater availability of these skills will be critical to the evolution and adoption of the capability

Recommendations: Coordinate a strategy around the areas discussed above. As with cloud-based computing in the public sector, the fundamental challenges will be policy and cultural change, which must be managed for the information and technology to develop.

What is Cognitive Computing and Why Should Program Executives Care?

What makes up Cognitive Computing

Cognitive computing is comprised of three main functional areas: natural language processing, machine learning, and hypothesis testing. The three functions combine to provide greater flexibility, which helps address a broader array of public sector business problems, including problems that could not have been solved before. Natural language processing enables machine learning and discovery algorithms to interact with the end user in a more meaningful way.

Natural Language Processing (NLP)

NLP describes a set of linguistic, statistical, and machine learning techniques that allow text to be analyzed and key information extracted for business value. Natural language analysis uses a pipeline approach: the question or text is broken apart by algorithms so that its structure, intent, tone, and so on are understood. Specific domains of knowledge, such as legal, finance, or social services, require targeted “dictionaries” or filters, which further improve the technology's ability to understand what is being asked of it.
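As a minimal sketch of such a pipeline, the Python below walks a question through a few toy stages; the stage names, the domain dictionary, and the intent rule are illustrative assumptions, not any particular product's API.

```python
# A toy natural-language pipeline: each stage adds a layer of analysis
# to the incoming question. The stages and the domain "dictionary"
# below are illustrative assumptions for this sketch.

DOMAIN_DICTIONARY = {            # targeted filter for a benefits domain
    "ei": "employment insurance",
    "cpp": "canada pension plan",
}

def tokenize(question: str) -> list[str]:
    return question.lower().strip("?!.").split()

def expand_domain_terms(tokens: list[str]) -> list[str]:
    # Replace domain shorthand with its full form so later stages can
    # match the corpus consistently.
    return [DOMAIN_DICTIONARY.get(t, t) for t in tokens]

def detect_intent(tokens: list[str]) -> str:
    # Crude intent detection from the question word.
    if tokens and tokens[0] in ("who", "what", "when", "where", "how"):
        return "factoid"
    return "directive"

def analyze(question: str) -> dict:
    tokens = expand_domain_terms(tokenize(question))
    return {"tokens": tokens, "intent": detect_intent(tokens)}

print(analyze("How do I apply for EI benefits?"))
```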

A key benefit of NLP is that it improves the interaction between humans and systems. Additional benefits of NLP include the following:

A question's contextual understanding can be derived through NLP. IT organizations develop a metadata (information about information) strategy, which gives more context to data and information sources. The more metadata and context added to a system, the better its understanding, which improves both finding information and providing an answer back in natural language. Instead of the page-ranked results of a typical search engine, the response comes in a form the user will understand.

The intent of a question is then better understood, which means the cognitive system can respond with a more meaningful answer, and it can return multiple responses, each with an associated confidence level. This gives the end user a more meaningful basis on which to make a decision (a short sketch of confidence-ranked answers follows at the end of this section).

Natural language processing has taken interaction with, and access to, information to a whole new level, which in turn increases productivity and end user satisfaction.
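As a minimal sketch of the confidence-ranked responses described above, the Python below scores a few candidate answers against a question and returns the best ones with confidence levels instead of a single page-ranked hit; the candidate answers and the word-overlap scorer are illustrative assumptions, far simpler than a real system's models.

```python
# Rank candidate answers by a confidence score instead of returning a
# single search hit. The candidates and the overlap-based scorer are
# illustrative assumptions for this sketch.

CANDIDATES = [
    "You can apply for employment insurance online within four weeks of your last day of work.",
    "Pension benefits are calculated from average earnings and years of contribution.",
    "Parental benefits may be shared between two parents.",
]

def confidence(question: str, answer: str) -> float:
    # Toy confidence: the fraction of question words found in the answer.
    q = set(question.lower().strip("?!.").split())
    return len(q & set(answer.lower().split())) / len(q)

def answer_with_confidence(question: str, top_n: int = 2):
    scored = sorted(((confidence(question, c), c) for c in CANDIDATES),
                    reverse=True)
    return scored[:top_n]

for score, text in answer_with_confidence("How do I apply for employment insurance?"):
    print(f"{score:.2f}  {text}")
```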

Machine Learning

Machine learning is about using algorithms to streamline organization, prediction, and pattern recognition. Big data by itself can be daunting, and ordinarily only data scientists can build and interpret the analysis; by incorporating machine learning and natural language processing, big data becomes easier to interpret for a much broader user group. Part of machine learning's “secret sauce” is the use of deep neural networks for information pattern analysis.

Deep neural networks, thanks to their multi-dimensional view, can learn from regularities across layers of information, which allows the machine learning algorithm to self-enhance its model and analysis parameters. This capability takes the onus off the end user or data scientist to mine information from multiple sources.
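As a rough sketch of that self-enhancement, the toy network below uses a single hidden layer to learn the XOR pattern, something no one-layer linear model can represent, adjusting its own weights from the data on each pass. The layer sizes, learning rate, and data are illustrative assumptions.

```python
import numpy as np

# A tiny multi-layer network learning XOR. The forward pass computes a
# prediction; the backward pass nudges the weights, so the model
# "self-enhances" its own parameters from the data.

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))   # input -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))   # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(10000):
    h = sigmoid(X @ W1 + b1)                 # forward pass
    out = sigmoid(h @ W2 + b2)
    d_out = (y - out) * out * (1 - out)      # backward pass (squared error)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 += h.T @ d_out
    b2 += d_out.sum(axis=0, keepdims=True)
    W1 += X.T @ d_h
    b1 += d_h.sum(axis=0, keepdims=True)

print(out.round(2))   # should approach [[0], [1], [1], [0]]
```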

The benefit to public sector programs is that deep domain knowledge will not be needed by the end user or the citizenry; the machine learning algorithms do the heavy lifting and analysis for them.

Historically, organizations had to depend on limited ways to analyze and report on information, which to a certain degree limited decision making and program outcomes. Now, with machine learning, these systems can be accessed through natural language or extremely flexible visualization tools, which makes decision making easier and more productive; cognitive systems are about knowledge automation rather than process automation.

Hypothesis Testing

A hypothesis is a proposed answer to a question or problem. A cognitive application uses the information that resides within a given corpus or domain of knowledge to test that hypothesis. Unlike humans, who typically test hypotheses in serial fashion, a key benefit of a cognitive system is that it can test hundreds of hypotheses in parallel. We see this occurring in areas such as health care and intelligence, where various proposed outcomes are tested against a domain of knowledge that may be comprised of many different sources and types of information. Because cognitive systems can test large volumes of hypotheses at high speed, programs and applications gain an improved means to make decisions with confidence and to remove the “noise” surrounding the problem being resolved. Benefits of hypothesis testing include the following (a sketch of parallel hypothesis scoring follows the list):

Causal induction provides a key benefit to the user because it uses statistical models to derive insight from a corpus or domain of knowledge. As these models become more refined, the insights they derive make interactions with the end user or citizen more meaningful.

Probabilistic reasoning can generate multiple responses to a question, which lets the user see all aspects of an outcome rather than a single biased view of the problem at hand. This is predicated on the system having enough context, and on its reaching a specific level of confidence before providing an answer. As systems learn through interaction and feedback, they will be able to identify when information is missing, which again enhances the decision making of a project or program.
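As a minimal sketch of the parallel testing described above, the Python below scores several candidate hypotheses concurrently against a small corpus and ranks them by confidence; the corpus, the hypotheses, and the word-overlap evidence scorer are illustrative assumptions standing in for real statistical models.

```python
from concurrent.futures import ThreadPoolExecutor

# Score many candidate hypotheses against a corpus in parallel rather
# than one at a time. Corpus, hypotheses, and scorer are toy stand-ins.

CORPUS = [
    "claim volumes rise sharply in december and january",
    "processing times increase when claim volumes rise",
    "seasonal hiring reduces processing times in february",
]

HYPOTHESES = [
    "seasonal claim volumes drive processing times",
    "staff turnover drives processing times",
    "seasonal hiring reduces processing times",
]

def evidence_score(hypothesis: str) -> tuple[str, float]:
    # Toy evidence score: average word overlap between the hypothesis
    # and each document in the corpus.
    h = set(hypothesis.split())
    overlaps = [len(h & set(doc.split())) / len(h) for doc in CORPUS]
    return hypothesis, sum(overlaps) / len(overlaps)

with ThreadPoolExecutor() as pool:          # test all hypotheses at once
    ranked = sorted(pool.map(evidence_score, HYPOTHESES),
                    key=lambda pair: pair[1], reverse=True)

for hypothesis, score in ranked:
    print(f"{score:.2f}  {hypothesis}")
```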

In summary, cognitive systems combine natural language processing with advanced algorithms and modelling tools to help workers make decisions in a shorter period of time, and to provide meaningful insight into a larger domain or corpus of information than the end user could ever have accessed or analyzed before cognitive computing technologies.

Contextual Computing

Infographic: Contextual Computing, unlocking the power of enterprise data

Update and Observations May 2014

I have been meeting with government and healthcare organizations over the past few months, and it is apparent that there is a general frustration with existing work-based technology and applications. I have discussed this before: we are technically “rich” in our personal lives and technically “poor” at work.

ERP, HRMS, and CRM systems deployed over the past 20 years had inherent user interface issues (i.e., they sucked). In past Business Value Assessments we found end users printing screenshots of the work they did in an ERP system because they did not trust the technology and feared being audited: reverting back to manual processes while supposedly using technology is fundamentally bad.

So, with your personal life filled with smartphones, tablets, smart televisions, and smart thermostats that talk to your smartphone, and with the “appification” of everything giving you, the consumer, a broad choice of applications, data, and platforms, your expectations rise rapidly.

So when you show up to work on a Monday and face multiple login screens to archaic and dysfunctional technology, you get depressed (which I would argue affects your productivity). You have diligently documented how to get work done via the “cheat sheets” at your cubicle or work area, and you plod along. Juxtapose what is on the smartphone on your hip with what you look at on your workstation: things need to change, and you need to be able to have a richer work experience.

So, as mentioned, my discussions with clients in all areas of the public sector have been interesting. They want to be more effective at getting work done, with technology that provides what they want, when they want it, on the platform of their choice. Over the next few posts I will be covering:

Work Optimization – Think about what method you use to get work done. Task-outcome and time-sensitive work are done differently, and there is always a process or workflow that productive people stick to, so understanding how people work becomes more important when deploying technology.

Open Data – What does this mean, and how does ECM fit into it? With open data sets, organizations have information in a format that can be repurposed and presented so that constituents are better informed and government agencies can enhance their decision making.

Contextual Computing – IBM has published a recent study on this developing area and how it helps people work better, with better decision making.

As always, I look forward to your questions and discussion.

Context 3.0

I did a search on Context 1.0 and 2.0; it appears to have been a project around creating context out of information. I just wanted to introduce the subject and the idea. IBM ECM talks about content in motion: how applications and solutions provide value to an organization by leveraging the foundational layer of the ECM architecture.

So, given that technology is at a highly convergent point in time, with GPS, social, analytics, predictive analytics, structured data, and unstructured information, how do we get information in context, which always helps decision making and knowledge sharing? Keep coming back as I attempt to analyze how the next big thing could be contextual information management.
