Executive Guide to Cognitive Computing Part 5

Recommended Strategic Planning and Considerations for Cognitive Computing Projects

To proceed, one must first define the use case (the business problem) and, more importantly, the question or questions that need to be answered.  As mentioned earlier, cognitive computing focuses on knowledge or information automation, in contrast to the process automation we are familiar with in traditional technology systems.

Defining the Intention of the Project

This may seem apparent, but organizations can get caught up in the excitement surrounding cognitive computing and lose sight of the type of problem that needs to be solved or resolved.

1. What types of questions will be asked, and what information will be analyzed?

2. What type of dialogue/conversation will be supported?

3. How will the system integrate with pre-existing systems?

4. How many potential users are there?

5. What is the size of the corpus or corpora (type of data and number of documents), and what is its level of complexity?

Defining the Objective

Having a clear idea of the outcome of the project is the next step.  This ensures that the proper cognitive technology is applied.  Are you analyzing text? Are you trying to assist a call centre agent? Are you attempting to automate a question-and-answer conversation?  The objective must be clear:

  • What type of problem are you trying to solve?
  • Who will be the user?
  • Are there multiple types of users, and what are their expectations?
  • What issues will your users be interested in?
  • What do they need to know?
  • Will they need to answer questions like How? and Why?
  • Is the objective based on knowledge and data (vs. process)?
  • What type of knowledge will be pivotal to the corpus (a segment of the domain or of the business/industry)?
  • Will the system need to provide assistance (to a citizen or an agent)?

 

Defining the Domain to Analyze

  • Helps to identify the data sources as well as the SMEs that will need to be involved
  • Can the objective narrow the domain focus?
  • Are there domain taxonomies, ontologies and catalogues?
  • Have you identified additional data sources not typically associated with solving problems in that domain (i.e., sources learned by experience)?

Recommended Next Steps on Your Cognitive Computing Journey

Assess the Cognitive Computing Maturity of the organization

Knowing where you are today versus where you want the organization to be in the future is critical.  Identifying gaps and priorities in how and where Cognitive Computing can be used provides a clear idea of what you have invested in the past and what you will need to invest in the future.  A Cognitive Computing maturity assessment is a quick way to begin to understand the level of effort required.  As shown in the illustration, it can quickly guide planning and investment decisions for this new technology.


Figure 2: Cognitive Computing Maturity Model Assessment Example

Engage with IT and IM

Since cognitive systems rely on accessing more forms of information (text, pictures, voice, sensor, geo-spatial and traditional sources), business and IT/IM must work together.  Although the business may lead the initiative, sourcing all of that information will require a broader, more cross-functional team to help develop the domains of knowledge.  Include the organizations that manage the ECM environments as well as other forms of information.  Understand where Open Data is coming from and what governance issues are associated with combining many forms of information together.  With IT/IM and the lines of business working together, the outcome of any project will be more successful.  However, it must be clear who the ultimate owner of the project is.

Cultural and Organization Readiness

Cognitive, like Analytics or Big Data, requires a shift in organizational culture.  Once groups and individuals learn that they can ask complicated questions and get answers in a shorter period of time, receptivity to this new, innovative technology will be positive.

Never underestimate the importance of organizational change management and foundational training for new projects or programs.  The organizational change management plan must be aligned with the domain and objective of the cognitive project.

Prioritize the Business Problem to Resolve

Cognitive systems are predicated on solving a specific problem, so it is critical that a clear business problem is identified.  Typically, it starts with a question or an area that needs to be better understood.  The traditional approach of IT defining the solution for the problem may not necessarily work.  What is needed is a cross-functional planning team that sees the business problem from many directions.  This team needs executive sponsorship and participation, as well as representation from multiple lines of business, because cognitive systems rely on more sources of information, which multiple lines of business may hold.  IT will be required to prioritize and plan access to those sources of information (data sets).

Identify Key Questions to Answer (How? and Why?)

Traditional analytics technologies have been able to answer questions like What? Where? and When?  Cognitive systems can answer those types of questions as well as How? and Why?  To answer them, there has to be a coordinated plan and a team that can help define the questions and then begin to assess where the information needed to answer them resides.

Rapid Proof of Technology Exercise

In order to quickly assess the capability of Cognitive Computing, a proof of technology session should be planned.  This is a limited-scope workshop that tests the viability of answering a question.  This approach is strongly recommended, since it addresses all of the points raised above.

Skill Development

Lastly, since Cognitive Systems require a new, algorithmic method of programming, skills may have to be developed and enhanced within the IT organization as well as within the lines of business.  This ties in with cultural change.  The dynamics of Cognitive Systems are different from those of traditional IT systems and project planning, and without proper skill development no system will succeed.

Executive Guide to Cognitive Computing Part 4

Five Dimensions that Cognitive Computing will evolve in Public Sector programs

As Cognitive Systems are deployed into Public Sector programs, they will continue to evolve over time due to the nature of algorithmic programming and the natural language processing of information.  Cognitive systems depend upon a feedback loop, and as such these dimensions will have an impact not only on the technology but also on the programs or projects they support.  This evolution is predicated on a few dimensions that affect perception and adoption.


Figure 1: Five Dimensions of how Cognitive Computing will evolve over time

 

Cognitive Computing will evolve along five dimensions that span both the technology and the cultural aspects of a public sector program.

Personalized Interactions

Given the ability of cognitive systems to interact via natural language and learn from those interactions, each business problem may require varying levels of interaction and personalization.  A social service benefit self-serve application, for example, will need a much more intimate understanding of the citizen.  Each person will need to be understood along more dimensions of interaction: access rights, location, personality, tone, sentiment, log history, etc.  All of these dimensions will provide a more satisfying interaction and outcome regardless of the user type.
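To make these dimensions concrete, they can be modelled as a per-user interaction context that the application consults before responding.  Below is a minimal Python sketch of that idea; the class and field names are hypothetical illustrations, not any product's API:

```python
from dataclasses import dataclass, field

@dataclass
class CitizenContext:
    """Hypothetical per-citizen context a cognitive application might maintain."""
    user_id: str
    access_rights: list = field(default_factory=list)    # e.g. ["benefits.read"]
    location: str = ""                                   # coarse location for routing
    tone: str = "neutral"                                # tone detected in the dialogue
    sentiment: float = 0.0                               # -1.0 (negative) .. 1.0 (positive)
    interaction_log: list = field(default_factory=list)  # prior questions and answers

    def personalize(self, answer: str) -> str:
        # Soften the response when the detected sentiment is strongly negative.
        if self.sentiment < -0.3:
            return "I understand this is frustrating. " + answer
        return answer
```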

Learning

As has been explained, cognitive systems rely heavily on machine learning.  Machine learning algorithms can be supervised or unsupervised.  It comes down to what level of complexity and knowledge the algorithm has of a specific domain.  Some applications will constantly need the input of a human subject matter expert in order to learn, whereas other systems will continue to enhance themselves through automated feedback loops.
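To illustrate the distinction, the hedged sketch below uses scikit-learn: the supervised model learns from SME-provided labels, while the unsupervised one finds groupings with no labels at all (the data is toy data, purely for illustration):

```python
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

# Toy case records: [days waiting, priority flag]
X = [[120, 1], [80, 0], [150, 1], [60, 0]]

# Supervised: a subject matter expert has labelled each case (1 = escalate).
y = [1, 0, 1, 0]
clf = LogisticRegression().fit(X, y)
print(clf.predict([[130, 1]]))   # classify a new, unseen case

# Unsupervised: discover structure without any expert labels.
km = KMeans(n_clusters=2, n_init=10).fit(X)
print(km.labels_)                # groupings found by the algorithm itself
```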

Recommendation: Develop a culture where analysis and question asking are supported, so that cognitive systems can aid decision-making outcomes.  There is also a need for SMEs within a domain or program area to participate in helping the cognitive systems learn.  This will impact workforce dynamics and must be positioned correctly so that users do not feel threatened by the new systems being developed.

Sensing

Since cognitive systems analyze larger data sets and require more dimensions of data, one path along which they will evolve is the number of data sets used to answer questions or make decisions.  We will see an expansion in the types of information sources to ‘sense’ and decide upon.  The Internet of Things, Dirty Data, Big Data, Open Data, Geo-Spatial and Social Media information sources provide greater contextual understanding for cognitive systems to integrate with, further enhancing their analysis capability.

Recommendations: Policy will need to change to address this evolution of information access, in accordance with public sector regulations and compliance.  Planning and strategy will be required for this evolution to occur.

Ubiquity

Public Sector workers are getting younger and are technologically ‘rich’ in their personal lives.  Expectations around technology use will increase exponentially as the public sector workforce changes.  The need to embed cognitive systems into how and where people work, regardless of device or location, will have to be planned for.

Recommendations: Since cognitive systems focus on information automation rather than process automation, and the information can be presented or integrated in any form, the interplay between systems and users can be made ubiquitous to meet the demands of the new public sector workforce.

Scalability

The ability to interact with government workers or citizens will continue to be enhanced by the ongoing development of natural language and conversation algorithms, which will ensure that the interaction between user and technology becomes easier over time.

Cognitive systems continue to enhance themselves through artificial intelligence and machine learning.  The evolution of feedback loops and deep neural networks will ensure that the developed algorithms are enhanced in an increasingly automatic fashion, so that the system can truly learn how to interact and respond better, with increased levels of confidence and information.

Cognitive systems are really a culmination of existing systems and algorithmic programming, combined with the need to incorporate more and larger data sets such as geospatial, weather, social media and Internet of Things (IoT) data.

Recommendations:

Public Sector leaders must plan for the fact that, in order to scale these systems, they will need to rely on technology on premise as well as in the Cloud or on a Platform as a Service (PaaS).

How do I use Cognitive Computing and for what benefit?

To truly understand what and how cognitive systems will benefit your organization, it is best to see how other public sector programs are starting to use cognitive machine learning and natural language processing to enhance their programs.

Tax

A large taxation department is using advanced methods of natural language processing to analyze structured information (SWIFT transactions) and unstructured information, such as social media and addresses, to investigate and analyze off-shore financial transactions.

Health Agency

A national health agency that is tasked with researching and assessing immunization products for the country it serves is exploring what natural language processing and text analytics will allow it to do.  Instead of manually reading thousands of medical journals and research documents by hand, which currently takes 10 months, the agency hopes to analyze and extract insight within a shorter period, thereby getting more effective immunization treatments to the populace sooner.

Law Enforcement

Investigations into major crimes and drug gang activity are being enhanced by combining many sources of information and allowing investigators to ask, in natural language, who someone is or even where they are.  This information is then fed into visualization tools to better see how individuals and organizations are linked.  Advanced analysis tools are being used for image analysis and the extraction of meaningful evidence that previously could only have been done by a human.  This allows a larger body of evidence to be gathered, and because the process is automated, the chain of evidence or forensics can be maintained as it is handed over to prosecution.

Challenges and Opportunities for Cognitive Computing in Public Sector

Challenges to Cognitive Computing in Public Sector

The challenge in today’s world of improving or innovating government programs is that we have a broad array of information and process automation to co-ordinate.  Another major challenge of being in a data-driven world is that information can be incorrect, false, out of date or inaccessible.  Cognitive Computing, with its ability to apply algorithmic programming, allows advanced patterns to be identified across a much larger group of data sets, which lets us reduce the “noise” associated with making decisions and achieving program outcomes.

There is also a need for evidence-based decision making, which must follow a prescribed methodology, as well as a need to analyze larger bodies of knowledge and information.  Traditional rules- and equation-based programming cannot manage this, nor can it interact with a human in natural language.  Cognitive Computing interacts with a government worker in natural language, with the ability to learn and enhance the algorithms needed to find and test hypotheses or questions.

Government must ensure information is secured and managed effectively.  One of the challenges of moving to cloud-based computing, or of sharing data sets across government, is the question of how far the data or information can move from where it was created.  Information provenance and governance practices must be in place, and private cloud or platform-as-a-service cognitive computing service catalogs are needed to ensure the data is kept within the boundaries of how it is to be governed and managed.  Data residency is another issue associated with using cloud services; however, most major cloud providers have data centres within the country, thereby offsetting data residency concerns.

Recommendations:

Most governments around the world today have a shared services model for core ICT and enterprise application support.  We are seeing governments now look at cloud brokerage services managed within government, which deals with data gravity issues, and, given the nature of the API economy, PaaS (Platform as a Service) offerings are now being investigated and tested.  We therefore see central shared services agencies being the agents of change, and we will look to them to deploy Cognitive Computing PaaS as a service catalog that other government agencies and projects can leverage, ensuring information is secured and protected according to its type.

Opportunity to Innovate in Government Programs

Due to the ability of cognitive computing to identify patterns or information at high speed across large sets of information, the opportunities in government are broad.  Since all information sources can be analyzed and combined (databases, text, etc.), a more complete picture is provided to the individual making decisions.

Any area within a government program that houses a large set of information relevant to a specific domain (benefits, policy, regulations, etc.) would benefit from cognitive computing, since this information can be analyzed as well as added to a corpus of knowledge that the machine learning algorithms can access and analyze across dimensions such as time or relevance.

Six forces will impact the future evolution of cognitive computing in the Public Sector.

Each has its own issues and challenges for the adoption of this technology.

Society

  • Tremendous demand for more intelligent machines, and access through mobile devices, can facilitate familiarity and comfort with the technology
  • Fears of privacy breaches and of machines taking human jobs could be a deterrent

Perception

  • Perceptions and expectations must be well managed
  • Unrealistic perceptions of risk and inflated expectations could lead to a third artificial intelligence (AI) winter

Policy

  • Wider adoption will require modifying existing policies (e.g., data sharing) and creating new ones (e.g., decision traceability)
  • Fear, uncertainty and doubt may be addressed by new policies (e.g., data security & privacy)

Technology

  • Advanced, intelligent devices will enable a greater understanding of entity context and contribute to the robustness of available information corpora
  • Greater scalability needs drive new architectures and paradigms

Information

  • Variety and scalability capabilities of future systems will advance rapidly to cope with information exhaust
  • Information explosion could advance evolution and adoption rates

Skills

  • Cognitive computing demands unique skills, such as natural language processing and machine learning
  • Greater availability of these key skills will be critical to the evolution and adoption of the capability

Recommendations: One must co-ordinate a strategy that covers the areas discussed above.  As with cloud computing in the public sector, the fundamental challenges will be the policy and cultural changes that must be managed in order for the information and the technology to develop.

What is Cognitive Computing and Why Should Program Executives Care?

What makes up Cognitive Computing

Cognitive Computing comprises three main functional areas: natural language processing, machine learning and hypothesis testing.  These three functions combine to provide greater flexibility, which helps address a broader array of business problems in the public sector, including problems that could not have been solved before.  Natural Language Processing enables machine learning and discovery algorithms to interact with the end user in a more meaningful way.

Natural Language Processing (NLP)

NLP describes a set of linguistic, statistical, and machine learning techniques that allow text to be analyzed and key information extracted for business value.  Natural language analysis uses a pipeline processing approach, wherein the question or text is broken apart by algorithms so that its structure, intent, tone, etc. are understood.  Specific domains of knowledge, such as legal, finance or social services, require targeted “dictionaries” or filters, which further improve the ability of the technology to understand what is being asked of it.
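The pipeline idea can be pictured as a chain of stages, each adding a layer of understanding to the raw text.  The Python sketch below is a deliberately simplified illustration; the stages and the domain dictionary are toy stand-ins for the much richer components a production NLP system uses:

```python
import re

# A tiny domain "dictionary" mapping terms to concepts (illustrative only).
SOCIAL_SERVICES_TERMS = {"benefit": "Benefit", "eligibility": "Eligibility"}

def tokenize(text):
    return re.findall(r"[a-z']+", text.lower())

def detect_intent(tokens):
    # Crude intent detection: a leading question word signals an information request.
    return "question" if tokens and tokens[0] in {"how", "why", "what", "who"} else "statement"

def tag_domain_concepts(tokens):
    return [SOCIAL_SERVICES_TERMS[t] for t in tokens if t in SOCIAL_SERVICES_TERMS]

def pipeline(text):
    tokens = tokenize(text)                        # stage 1: break the text apart
    return {
        "tokens": tokens,
        "intent": detect_intent(tokens),           # stage 2: understand intent
        "concepts": tag_domain_concepts(tokens),   # stage 3: apply the domain filter
    }

print(pipeline("How do I check my benefit eligibility?"))
```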

One of the key benefits of NLP is that it improves the interaction between humans and systems.  Additional benefits of NLP are as follows:

A question’s contextual understanding can be derived from NLP.  IT organizations develop a metadata (information about information) strategy, which gives more context to data and information sources.  The more metadata and context added to a system, the better its understanding.  This improves the system’s ability to find information and then provide an answer back in natural language: instead of the page-ranked results one would get from a typical search engine, the response comes in a form the user will understand.

The intent of a question is then better understood, which means the cognitive system can respond in a more meaningful way, as well as return multiple responses with associated confidence levels.  This gives the end user a more meaningful basis on which to make a decision.
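In practice this means the system returns several candidate answers, each with a confidence score, rather than a single hit list.  A minimal illustration, where a toy word-overlap measure stands in for real evidence scoring:

```python
def confidence(question, passage):
    """Toy confidence: fraction of question words that appear in the passage."""
    q_words = {w.strip("?.,") for w in question.lower().split()}
    p_words = {w.strip("?.,") for w in passage.lower().split()}
    return len(q_words & p_words) / len(q_words)

corpus = [
    "Benefit eligibility is based on income and residency.",
    "Office hours are 9 to 5 on weekdays.",
]

question = "What is benefit eligibility based on?"
for score, answer in sorted(((confidence(question, p), p) for p in corpus), reverse=True):
    print(f"{score:.2f}  {answer}")
```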

Natural Language Processing has taken interaction and access to information to a whole new level, which will in turn provide increased productivity and satisfaction to the end user.

Machine Learning

Machine Learning is all about using algorithms to help streamline organization, prediction and pattern recognition.   Big Data by itself can be daunting, and only data scientists can build and interpret the analysis; by incorporating machine learning and natural language processing, Big Data becomes easier to interpret for a broader user group.  Part of Machine Learning’s “secret sauce” is the use of deep neural networks for information pattern analysis.

Deep neural networks, due to their multi-dimensional view, can learn from regularities across layers of information, which allows the machine learning algorithm to self-enhance its model and analysis parameters.  This capability takes the onus off the end user or data scientist to mine information from multiple sources.
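As a concrete, if greatly simplified, illustration: a small multi-layer network in scikit-learn learns an internal representation from examples rather than from hand-written rules (toy data, for illustration only):

```python
from sklearn.neural_network import MLPClassifier

# A pattern no single linear rule captures (XOR-like relationships).
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 1, 1, 0]

# Two hidden layers let the model learn the regularity across layers itself.
net = MLPClassifier(hidden_layer_sizes=(8, 8), max_iter=5000, random_state=1)
net.fit(X, y)
print(net.predict([[0, 1], [1, 1]]))   # expected to recover the pattern, e.g. [1 0]
```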

The benefit to public sector programs is that deep domain knowledge will not be needed by the end user or the citizenry; the machine learning algorithms do the heavy lifting and analysis for them.

Historically, organizations had to depend on limited ways to analyze and report on information, which to a certain degree limited decision making and program outcomes.  Now, with machine learning, a key benefit is the ability to access these systems with natural language or with extremely flexible visualization tools, which make decision making easier and more productive, since cognitive systems are about knowledge automation rather than process automation.

Hypothesis Testing

A hypothesis is a proposed answer to a question, drawn from pre-existing or understood responses.  From there, a cognitive application uses the information residing within a certain corpus or domain of knowledge to test the hypothesis.  Unlike humans, who typically test hypotheses one at a time, a cognitive system can test hundreds of hypotheses in parallel (a minimal sketch of this appears after the list of benefits below).  We see this occurring in areas such as health care or intelligence, where various proposed outcomes are tested against a domain of knowledge, and that domain may comprise many different sources and types of information.  Because cognitive systems can test large volumes of hypotheses at high speed, programs and applications benefit from an improved means of making decisions with confidence and from the removal of the “noise” surrounding the question being resolved.  Some benefits of hypothesis testing are:

Causal induction provides a key benefit to the user, since it is based on statistical models deriving insight from a corpus or domain of knowledge.  As these models become more refined, the ability to derive insightful responses provides more meaningful interactions with the end user or citizen.

Probabilistic reasoning can generate multiple responses to a question, which allows the user to see all aspects of an outcome rather than a single biased view of the problem at hand.  This is predicated on the system having enough context and reaching a sufficient level of confidence to provide an answer.  As systems learn through interaction and feedback, they will be able to identify when information needed for an answer is missing, which again enhances the decision-making process of a project or program.
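Here is the minimal sketch of parallel hypothesis testing promised above: each candidate hypothesis is scored against the corpus independently, so they can all be evaluated at once (the scoring function is a toy stand-in for real evidence models):

```python
from concurrent.futures import ThreadPoolExecutor

corpus = [
    "patients responded well to treatment a",
    "treatment b showed no improvement",
    "treatment a reduced symptoms in trials",
]

def score(hypothesis):
    """Toy evidence score: share of passages containing all hypothesis terms."""
    terms = set(hypothesis.lower().split())
    support = sum(1 for passage in corpus if terms <= set(passage.split()))
    return hypothesis, support / len(corpus)

hypotheses = ["treatment a", "treatment b"]

# Score every hypothesis in parallel rather than serially, as a human would.
with ThreadPoolExecutor() as pool:
    for hypothesis, conf in pool.map(score, hypotheses):
        print(f"{hypothesis}: confidence {conf:.2f}")
```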

In summary, Cognitive Systems combine natural language processing with advanced algorithms and modelling tools to help workers make decisions in a shorter period of time and/or to provide more meaningful insight into a larger domain or corpus of information that the end user would never have been able to access or analyze prior to Cognitive Computing technologies.

An Executive's Guide to Cognitive Computing Part 1

What you need to know about Cognitive Computing Part 1

In the coming weeks I will lay out an explanation of what Cognitive Computing is and recommendations on how it can be used.  As always, I look forward to comments and thoughts.

Technology continues to develop and improve how we interact with systems and people.  Over the past decade there have been radical developments in how computers make sense of:

·      Text

·      Voice

·      Pictures

Given these capabilities, what will this mean for your agency or program?  The possibilities are endless when one considers the value of integrating text, voice and pictures into the decision-making process.   The purpose of this discussion is to investigate the “art of the possible” and to provide an overview of Cognitive Computing. We will conclude with a focus on approaches and recommended next steps.

This new shift in technology provides the ability to automate knowledge and decision making in a more meaningful way.  Policy change will need to occur to reap the benefits of this new technology, and Cognitive Computing will face some challenges over the next few years.  It will evolve over time as it is used by public sector organizations.

Public Sector organizations around the world have too much information to analyze, and traditional Information Technology (IT) systems cannot cope with the variety or volume.  Each department or program needs access to more sources of information: the more “dimensions” of information accessed (e.g. geo-spatial, social media, weather data), the better the outcome.

In today’s world, employees and programs are asking harder questions than traditional means can handle. As examples, one government agency is trying to analyze the dynamics of the underground economy, and a state government is attempting to understand why municipalities dissolve or succeed.  It is important to apply critical thinking principles via cognitive computing, and there is a need to eliminate the “noise” that exists in today’s information-driven society.  Organizations can now use machine learning and hypothesis testing to apply critical thinking to larger sources of all types of information.

The recommendations of this discussion will focus on ensuring a clear objective and a complete, well-defined domain of knowledge.  This ensures that cognitive systems will work successfully to solve complex business problems.

Thoughts?

Dragnet past and future

When driving home at night, I have found that Spotify can stream old-time radio shows into my car.  As I gleefully tapped on one of the first episodes of Dragnet, broadcast in 1949, I was surprised and amused to hear Sergeant Joe Friday discuss putting his suspect’s profile and MO into “the IBM machine”, and amazed at what the punch-card interface would tell him!  Even back then, IBM was helping law enforcement analyze the profile of a suspect against large data sets to better their investigations.

I now look at today’s world, with its volume of data and the various formats it comes in, and see how IBM is still giving modern-day Joe Fridays the ability to do their jobs better.  Obviously I cannot discuss or illustrate some of the really cool examples of what IBM is doing, but suffice it to say that IBM’s innovation, from the 1940s all the way up to today, has always been about answering specific questions for clients in the most meaningful way.

We see technology helping law enforcement in many ways: predicting “hot” spots in cities so that police can better manage their force and community policing practices, using advanced analytics for gang behaviours, and much more.  The list goes on and on.

I was in a meeting this morning where, once again, IBM could take pride in the sheer number of mathematicians and researchers who are constantly striving to answer hard questions and to engage with customers in a constructive way.

I just wanted to share a little observation of what a company’s longevity and perspective can give.

IBM supporting Spark

Working with our Government clients worldwide, I have seen great, innovative Big Data projects.  As data volumes and algorithms get more complex, the need to manage those interactions has put a strain on existing technologies.  Today IBM announced support for Spark.  This is a very exciting announcement, and I encourage you to learn more about what it means for the world of Analytics and Big Data.

To summarize the announcement, IBM will be:

  • Building Spark into the core of our analytics and commerce platforms.
  • Open sourcing our IBM SystemML machine learning technology and collaborating with Databricks to advance Spark’s machine learning capabilities.
  • Offering Spark as a service on IBM Bluemix to enable app developers to quickly load and model data (a brief sketch of this follows below).
  • Opening a Spark Technology Center in San Francisco for the Data Science and Developer community to be at the center of Spark innovation, collaborating closely with AMPLab and Databricks.
  • Educating at least one million data scientists and data engineers on Spark through partnerships with AMPLab, DataCamp, MetiStream, and the Big Data University MOOC.

This represents a commitment of more than 3,500 researchers and developers working on Spark-related projects at more than a dozen labs worldwide.
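To give a feel for why Spark as a service matters to developers, here is a hedged PySpark sketch of quickly loading and modelling data (the file name and column names are hypothetical):

```python
from pyspark.sql import SparkSession
from pyspark.ml.clustering import KMeans
from pyspark.ml.feature import VectorAssembler

spark = SparkSession.builder.appName("quick-model").getOrCreate()

# Load a (hypothetical) CSV of service requests into a distributed DataFrame.
df = spark.read.csv("service_requests.csv", header=True, inferSchema=True)

# Assemble numeric columns into a feature vector, then cluster the records.
assembler = VectorAssembler(inputCols=["wait_days", "visits"], outputCol="features")
data = assembler.transform(df)
model = KMeans(k=3, seed=1).fit(data)
model.transform(data).select("prediction").show(5)

spark.stop()
```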
Follow @IBMBigData and @IBMAnalytics on Twitter

Watch the Livestream on IBM.com at IBM.com/spark on June 15th at 7 pm PDT. There will also be a replay. 

Visit the Spark Summit Website

Big Data Reality

When working with government clients, it amazes me how broad the descriptions and ideas are in conversations about big data.  The biggest hurdle for these types of projects is that IT or a program lead has an idea for data analysis, but culturally the agency or program is not thinking about what Big Data could answer for them.

One area that does not get a lot of attention in a business case attempting to justify a Big Data or Analytics project is organizational change management (this is true of all IT projects, I find).  So how do you instill a culture where end users and executives alike start asking what Big Data could answer for them?

What I see are proofs of concept that pick an area likely to instill curiosity across the various business units, so that they can see what is possible.  Or, if they already have clear data sets, the question becomes: what else can we ask of them?

Another challenge I see is that most organizations have skill sets and knowledge that apply to business intelligence tools or other analytics; however, when you start to look at what a data scientist does and the algorithms they develop, it becomes apparent that most organizations lack the proper skills to implement a Big Data project.

Currently we are looking at helping government organizations by “hooking” them up with data scientists so they can work with their data sets and try to answer some complicated questions, such as workforce development or the underground economy: predicting and understanding so that programs and legislation are more effective at meeting their mandates.

Back to the Future

I know I have not posted in quite some time.  A few changes have occurred: my responsibility at work has broadened to include more of the analytics offerings from IBM.  This move is exciting and has opened my eyes to the great things that governments are doing worldwide in the areas of Big Data and Analytics.

What I do see are government agencies still wrestling with Big Data.  What actually is it?  And what can I do with it?  We are actively helping agencies see what is possible with Big Data by aligning Data Scientists with best practices, so that agencies actually get to see the art of the possible.

The other observation I have made is that, due to my ECM background, I am sensitive to information lifecycle governance (ILG) and compliance issues.   When looking at Big Data, some agencies think they can simply throw data sets into environments such as Hadoop; yet much of this information is still unstructured, and without proper metadata management these Big Data projects put the agency at risk of losing or improperly storing the information they are attempting to use.

I guess it still comes down to fundamentals.  Regardless of how innovatively the data is used, the basics of information governance still reign supreme.

Stay tuned for my next post, on cyber security and threat prediction.

Internet of Things (IoT) – Designing Privacy and Security Into Devices

Internet of Things (IoT) – Designing Privacy and Security Into Devices | The National Law Review: IBM Edition http://ow.ly/K3Zhc
