Pomodoro and Productivity Tools – Why Not More Integration?

I have taken a short break from building my AI Bot due to work.  Sometimes priorities take over, which led me to want to make a quick note.

I have used many productivity tools over the years; currently I use OmniFocus more than the others.  I have also tried Things 3, Todoist and other apps.  All are very good, and each does things somewhat differently.

I am also waiting to see what happens with OmniFocus 3 this year.

I have also tried many Pomodoro technique apps on my computer, iPhone and iPad.  I love the way the technique helps you focus and chunk your time.

The question I have is why the two practices are not more tightly integrated.  I would hope that if I had a task in OmniFocus I was working on (whether or not I allocate time slices to it), I could initiate a Pomodoro time window to work on that task.  A rough sketch of what I mean follows below.
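To make the idea concrete, here is a minimal sketch in Python of the kind of timer I would want a task app to trigger. It is purely illustrative: nothing here talks to OmniFocus, and the task name and durations are placeholders.

```python
import sys
import time

WORK_MINUTES = 25   # the classic Pomodoro work window
BREAK_MINUTES = 5   # the short break that follows

def pomodoro(task_name: str) -> None:
    """Run one Pomodoro work window for the given task."""
    print(f"Starting a {WORK_MINUTES}-minute Pomodoro for: {task_name}")
    time.sleep(WORK_MINUTES * 60)    # the focused work window
    print(f"Pomodoro done. Take a {BREAK_MINUTES}-minute break.")
    time.sleep(BREAK_MINUTES * 60)   # the rest window
    print("Break over; pick the next task.")

if __name__ == "__main__":
    # e.g. python pomodoro.py "Draft the status report"
    pomodoro(sys.argv[1] if len(sys.argv) > 1 else "Unnamed task")
```

The missing piece, and the point of this post, is that first argument: the task name should come straight from the task manager rather than being retyped by hand.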

All of the current productivity tools lack this integration point.  I realize many people may not use the Pomodoro technique to do their work, but I would think there is a large group of people who want this functionality.

There have been attempts to integrate through apps like Vitamin-R, but the result is awkward rather than tightly and seamlessly integrated.

I am still hopeful, and I am holding back purchases until someone steps forward to deliver this integration of GTD and Pomodoro and allow us to be more productive in an active, direct way.

Executive Guide to Cognitive Computing Part 5

Recommended Strategic Planning and Considerations for Cognitive Computing Projects

To proceed, one must start by defining the use case (business problem) and, more importantly, the question or questions that need to be answered.  As mentioned earlier, cognitive computing is a new focus on knowledge or information automation versus the process automation we are familiar with in traditional technology systems.

Defining the Intention of the Project

This may seem apparent, but organizations can get caught up in the excitement surrounding cognitive computing and lose sight of what type of problem needs to be solved.  Consider:

1. What type of questions/information will be asked or analyzed?

2. What type of dialogue/conversation will be supported?

3. What integration with pre-existing systems is required?

4. How many potential users are there?

5. What is the corpus size (type of data and number of documents) and level of complexity?

Defining the Objective

Having a clear idea of the outcome of the project is the next step.  This ensures that the proper cognitive technology is applied.  Are you analyzing text? Are you trying to assist a call centre agent? Are you attempting to automate a question-and-answer conversation?  The objective must be clear:

  • What type of problem are you trying to solve?
  • Who will be the user?
  • Are there multiple types of users, and what are their expectations?
  • What issues will your users be interested in?
  • What do they need to know?
  • Will they need to answer questions like How and Why?
  • What is the objective based on knowledge and data (vs. process)?
  • What type of knowledge will be pivotal to the corpus (segment of a domain or business/industry)?
  • Will the system need to provide assistance (to a citizen or an agent)?

 

Defining the Domain to Analyze

  • Helps to identify data sources as well as the SMEs that will need to be involved
  • Can the objective narrow the domain focus?
  • Are there domain taxonomies, ontologies and catalogues?
  • Have you identified additional data sources not typically associated with solving problems in that domain (i.e. learned by experience)?

Recommended Next Steps on Your Cognitive Computing Journey

Assess the Cognitive Computing Maturity of the organization

Knowing where you are today versus where you want the organization to be in the future is critical.  Identifying gaps and priorities in how and where Cognitive Computing can be used provides a clear picture of what you have invested in the past and what you will need to invest in the future.  A Cognitive Computing maturity assessment is a quick way to start to understand the level of effort required.  As shown in the illustration, it can quickly guide decisions in planning and investment in this new technology.


Figure 2: Cognitive Computing Maturity Model Assessment Example

Engage with IT and IM

Since cognitive systems will rely on accessing more forms of information (text, pictures, voice, sensor, geo-spatial and traditional sources), business and IT/IM must work together.  Although business may lead the initiative, determining where all that information will come from requires a broader, more cross-functional team to help develop the domains of knowledge.  Include the organizations that manage the ECM environments as well as other forms of information.  Understand where Open Data is coming from and what governance issues are associated with combining many forms of information.  With IT/IM and the lines of business working together, the outcome of any project will be more successful; however, it must be clear who the ultimate owner of the project is.

Cultural and Organization Readiness

Cognitive, like analytics or Big Data, requires a shift in the organizational culture.  Once groups and individuals learn that they can ask complicated questions and get answers in a shorter period of time, receptivity to an innovative new technology will be positive.

Never underestimate the importance of organizational change and fundamental training for new projects or programs.  The organizational change management plan must be aligned to the domain and objective of the cognitive project.

Prioritize the Business Problem to Resolve

Cognitive systems are predicated on solving a specific problem, so it is critical that a clear business problem is identified.  Typically, it starts with a question or an area that needs to be better understood.  The traditional approach of IT defining the solution for the problem may not necessarily work.  What is needed is a cross-functional planning team that sees the business problem from many directions.  This team needs executive sponsorship and participation, as well as multiple lines of business, because cognitive systems rely on more sources of information, which multiple lines of business may hold.  IT will be required to prioritize and plan access to those sources of information (data sets).

Identify Key Questions to answer (How? and Why?)

Traditional analytics technologies have been able to answer questions like What? Where? and When?  Cognitive systems can answer those types of questions as well as How? and Why?  To answer them, there has to be a coordinated plan and a team that can help define the questions and then start to assess where the information needed to answer them resides.

Rapid Proof of Technology Exercise

To quickly assess the capability of Cognitive Computing, a proof of technology session should be planned.  This encompasses a limited-scope workshop that tests the viability of answering a question.  This approach is strongly recommended since it addresses all the points raised above.

Skill Development

Lastly, since Cognitive Systems require a new method of programming (algorithmic), skills may have to be developed and enhanced within the IT organization as well as the lines of business.  This ties in with cultural change.  The dynamics of Cognitive Systems are different from traditional IT systems and project planning.  Without proper skill development, no system will succeed.

Executive Guide to Cognitive Computing Part 4

Five Dimensions along which Cognitive Computing will Evolve in Public Sector Programs

As Cognitive Systems are deployed into Public Sector programs, they will continue to evolve over time due to the nature of algorithmic programming and the natural language processing of information.  Cognitive systems depend on a feedback loop, and as such these dimensions will have an impact not only on the technology but also on the programs and projects it supports.  This evolution is predicated on a few dimensions that affect perception and adoption.


Figure 1: Five Dimensions of how Cognitive Computing will evolve over time

 

Cognitive Computing will evolve across five dimensions that span both the technology and the cultural aspects of a public sector program.

Personalized Interactions

Given the ability of cognitive systems to interact via natural language and learn from those interactions, each business problem may require varying levels of interaction and personalization.  A social service benefit self-serve application, for example, will need a much more intimate understanding of the citizen.  Each person will need to be understood along more dimensions of interaction: access rights, location, personality, tone, sentiment, log history, etc.  All of these dimensions will provide a more satisfying interaction and outcome regardless of the user type.

Learning

As has been explained, cognitive systems rely heavily on machine learning.  Machine learning algorithms can be supervised or unsupervised; it comes down to what level of complexity and knowledge the algorithms have of a specific domain.  Some applications will constantly need the input of a human subject matter expert in order to learn, whereas other systems will continue to enhance themselves through automated feedback loops.
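As a concrete aside, here is a minimal sketch of the supervised/unsupervised distinction using scikit-learn; the tiny data set, features and labels are invented purely for illustration:

```python
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

# Invented data: each row is a document reduced to two simple features,
# e.g. [fraction of domain terms, fraction of boilerplate].
documents = [[0.9, 0.2], [0.8, 0.3], [0.1, 0.9], [0.2, 0.8]]

# Supervised: a human SME has labelled each document
# (1 = relevant to the program domain, 0 = not), and the model learns
# from those labels -- this is where expert input is needed.
labels = [1, 1, 0, 0]
model = LogisticRegression().fit(documents, labels)
print(model.predict([[0.85, 0.25]]))  # -> [1]

# Unsupervised: no labels at all; the algorithm discovers structure
# (here, two clusters) on its own.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(documents)
print(clusters)  # e.g. [1 1 0 0] -- cluster ids are arbitrary
```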

Recommendation: Develop a culture where analysis and question asking are supported and where cognitive systems aid decision-making outcomes.  There is also a need for SMEs within a domain or program area to participate in helping the cognitive systems learn.  This will impact workforce dynamics and must be positioned correctly so that users do not feel threatened by the new systems being developed.

Sensing

Since cognitive systems analyze larger data sets and require more dimensions of data, one path along which they will evolve is the number of data sets used to answer questions or make decisions.  We will see an expansion of the types of information sources to ‘sense’ and decide upon.  The Internet of Things, dirty data, Big Data, Open Data, geo-spatial and social media information sources provide greater contextual understanding for cognitive systems to integrate and enhance their analysis capability.

Recommendations: Policy will be challenged to address the evolution of information access in accordance with public sector regulations and compliance.  Planning and strategy will be required for this evolution to occur.

Ubiquity

Public sector workers are getting younger and are technically ‘rich’ in their personal lives.  Expectations of technology use will increase exponentially as the public sector workforce changes.  Embedding cognitive systems in how people work and where they work, regardless of device or location, will need to be planned for.

Recommendations: Since cognitive systems focus on information automation rather than process automation, and the information can be presented or integrated in any form, the ubiquitous interplay between systems and users can be supported to meet the demands of the new public sector workforce.

Scalability

The ability to interact with government workers or citizens will continue to be enhanced by the ongoing development of natural language and conversation algorithms, ensuring that interaction between user and technology becomes easier over time.

Cognitive systems continue to enhance themselves through artificial intelligence and machine learning.  The evolution of feedback loops and deep neural networks will ensure that the developed algorithms are enhanced more automatically, so that the system can truly learn how to interact and respond better, with increasing levels of confidence and information.

Cognitive systems are really a culmination of existing systems and algorithmic programming, combined with the need to incorporate more and larger data sets such as geospatial, weather, social media and Internet of Things (IoT) data.

Recommendations:

Public sector leaders must plan for the fact that scaling these systems will require technology on premise as well as in the cloud or on a Platform as a Service (PaaS).

How do I use Cognitive Computing and for what benefit?

To truly understand how cognitive systems will benefit your organization, it is best to see how other public sector programs are starting to use machine learning and natural language processing to enhance their programs.

Tax

A large taxation department is using advanced methods of natural language processing to analyze structured information (SWIFT transactions) and unstructured information, such as social media and addresses, to investigate off-shore financial transactions.

Health Agency

A national health agency tasked with researching and assessing immunization products for the country it serves is looking at what natural language processing and text analytics could allow it to do.  Instead of manually reading thousands of medical journals and research documents, which currently takes 10 months, the agency hopes to analyze and extract insight in a much shorter period, thereby getting more effective immunization treatments to the populace sooner.

Law Enforcement

Investigations into major crimes and drug gang activity are being enhanced by combining many sources of information and allowing investigators to ask in natural language who someone is or even where they are.  This information is then fed into visualization tools to better see how individuals and organizations are linked.  Advanced analysis tools are being used for image analysis and the extraction of meaningful evidence that previously could only have been done by a human.  This allows a larger body of evidence to be gathered, and because the process is automated, the chain of evidence and forensics can be maintained as it is handed over to prosecution.

Challenges and Opportunities for Cognitive Computing in Public Sector

Challenges to Cognitive Computing in Public Sector

The challenge in today’s world of improving or innovating government programs is that we have a broad array of information and process automation to co-ordinate.  Another major challenge of a data-driven world is that information can be wrong, out of date or inaccessible.  Cognitive Computing, with its ability to apply algorithmic programming, allows advanced patterns to be identified across a much larger group of data sets, which lets us reduce the “noise” associated with decision making and program outcomes.

There is also a need for evidence-based decision making, which must follow a prescribed methodology, as well as a need to analyze larger bodies of knowledge and information.  Traditional rules- and equation-based programming cannot interact with a human in natural language.  Cognitive Computing interacts with a government worker in natural language, with the ability to learn and enhance the algorithms needed to find and test hypotheses or questions.

Government must ensure information is secured and managed effectively.  One of the challenges of moving to cloud-based computing or sharing data sets across government is how far the data or information can move from where it was created.  Information provenance and governance practices must be in place, and private cloud or Platform as a Service cognitive computing service catalogues are needed to ensure the data is kept within the boundaries of how it is to be governed and managed.  Data residency is another issue associated with using cloud services; however, most major providers have data centres within the country, thereby offsetting data residency concerns.

Recommendations:

Most governments around the world today have a shared services model for core ICT and enterprise application support.  Governments are now looking at cloud brokerage services managed within government, which addresses data gravity issues, and given the nature of the API economy, Platform as a Service (PaaS) offerings are being investigated and tested.  Therefore, we see central shared services agencies being the agents of change, and we will look to them to deploy a Cognitive Computing PaaS as a service catalogue that other government agencies and projects can leverage, ensuring information is secured and protected according to its type.

Opportunity to Innovate in Government Programs

Due to the ability of cognitive computing to identify patterns in large sets of information at high speed, the opportunities in government are broad.  Since all information sources can be analyzed and combined (databases, text, etc.), a more complete picture is provided to the individual making decisions.

Any area within a government program that houses a large set of information relevant to a specific domain (benefits, policy, regulations, etc.) would benefit from cognitive computing, since this information can be analyzed as well as added to a corpus of knowledge that machine learning algorithms can access and analyze across dimensions such as time or relevance.

Six Forces that will Impact the Future Evolution of Cognitive Computing in Public Sector

Each force presents its own issues and challenges for the adoption of this technology.

Society

  • Tremendous demand for more intelligent machines, and access through mobile devices, can facilitate familiarity and comfort with the technology
  • Fears of privacy breaches and of machines taking human jobs could be a deterrent

Perception

  • Perceptions and expectations must be well managed
  • Unrealistic perceptions of risk and inflated expectations could lead to a third “AI winter”

Policy

  • Wider adoption will require modifying existing policies (e.g., data sharing) and creating new ones (e.g., decision traceability)
  • Fear, uncertainty and doubt may be addressed by new policies (e.g., data security & privacy)

Technology

  • Advanced, intelligent devices will enable a greater understanding of entity context and contribute to the robustness of available information corpora
  • Greater scalability needs drive new architectures and paradigms

Information

  • Variety and scalability capabilities of future systems will advance rapidly to cope with information exhaust
  • Information explosion could advance evolution and adoption rates

Skills

  • Cognitive computing demands unique skills, such as natural language processing and machine learning
  • Greater availability of these skills will be critical to the evolution and adoption of the capability

Recommendations: One must co-ordinate a strategy that revolves around the areas discussed above.  The fundamental challenges, much as with cloud-based computing in the public sector, will be the policy and cultural changes that must be managed for the information and technology to develop.

What is Cognitive Computing and Why Should Program Executives Care?

What makes up Cognitive Computing

Cognitive Computing comprises three main functional areas: natural language processing, machine learning and hypothesis testing.  All three functions combine to provide greater flexibility, which helps address a broader array of business problems in the public sector, problems that could not have been solved before.  Natural language processing enables machine learning and discovery algorithms to interact with the end user in a more meaningful way.

Natural Language Processing (NLP)

NLP describes a set of linguistic, statistical, and machine learning techniques that allow text to be analyzed and key information to be extracted for business value.  Natural language analysis uses a pipeline processing approach, wherein the question or text is broken apart by algorithms so that its structure, intent, tone, etc. are understood.  Specific domains of knowledge, such as legal, finance or social services, require targeted “dictionaries” or filters, which further improve the technology’s ability to understand what is being asked of it.
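To illustrate the pipeline idea, here is a deliberately tiny sketch in Python. The stages, the domain dictionary and the sample question are all invented; a real system would use a full NLP library rather than these toy rules:

```python
import re

# A toy domain "dictionary" for a social-services corpus (invented terms).
DOMAIN_TERMS = {"benefit", "eligibility", "application", "income"}

def tokenize(text: str) -> list[str]:
    """Stage 1: break the text apart into lower-cased tokens."""
    return re.findall(r"[a-z']+", text.lower())

def detect_intent(tokens: list[str]) -> str:
    """Stage 2: a crude guess at intent from the leading question word."""
    question_words = {"how", "why", "what", "where", "when", "who"}
    return tokens[0] if tokens and tokens[0] in question_words else "statement"

def extract_domain_terms(tokens: list[str]) -> set[str]:
    """Stage 3: filter the tokens through the domain dictionary."""
    return DOMAIN_TERMS.intersection(tokens)

question = "How do I check the eligibility of my benefit application?"
tokens = tokenize(question)
print(detect_intent(tokens))         # -> how
print(extract_domain_terms(tokens))  # -> {'eligibility', 'benefit', 'application'}
```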

One of the key benefits of NLP is that it improves the interaction between humans and systems.  Some additional benefits are as follows:

A question’s contextual understanding can be derived from NLP.  IT organizations develop a metadata (information about information) strategy, which gives more context to data and information sources.  The more metadata and context added to a system, the better its understanding.  This improves finding information and then providing an answer back in natural language: instead of the page-ranked results one would get from a typical search engine, the response is in a form the user will understand.
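A small sketch of what a metadata strategy looks like at the level of a single record; every field name and value here is invented for illustration:

```python
# A raw piece of content plus the metadata that gives it context.
document = {
    "text": "Applicants must reside in the province for 12 months.",
    "metadata": {
        "source": "benefits-policy-manual",  # where it came from
        "domain": "social-services",         # which corpus it belongs to
        "effective_date": "2014-01-01",      # temporal context
        "jurisdiction": "provincial",        # scope of applicability
    },
}

# With metadata, a system can answer "which current provincial rules
# mention residency?" rather than just keyword-matching on "reside".
if (document["metadata"]["jurisdiction"] == "provincial"
        and "reside" in document["text"].lower()):
    print(document["text"])
```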

The intent of a question is then better understood, which means the cognitive system can respond more meaningfully, as well as return multiple responses with an associated confidence level.  This gives the end user a more meaningful basis on which to make a decision.

Natural language processing has taken interaction and access to information to a whole new level, which in turn provides increased productivity and satisfaction to the end user.

Machine Learning

Machine learning is all about using algorithms to help streamline organization, prediction and pattern recognition.  Big Data by itself can be daunting, and often only data scientists can build and interpret the analysis; by incorporating machine learning and natural language processing, Big Data becomes easier for a broader user group to interpret.  Part of machine learning’s “secret sauce” is using deep neural networks for information pattern analysis.

Deep neural networks, thanks to their multi-dimensional view, can learn from regularities across layers of information, which allows the machine learning algorithm to self-enhance its model and analysis parameters.  This capability takes the onus off the end user or data scientist to mine information from multiple sources.
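As a toy illustration of why layers matter, here is a small multi-layer network from scikit-learn learning an XOR-style pattern that no single linear rule can capture; the data is invented and the network is far from “deep”, but the layered-learning idea is the same:

```python
from sklearn.neural_network import MLPClassifier

# Invented XOR-like data: the classes cannot be separated by one line,
# so a model needs at least one hidden layer to learn the pattern.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 1, 1, 0]

# Each hidden layer learns regularities in the output of the layer
# below it -- the "layers of information" idea in miniature.
net = MLPClassifier(hidden_layer_sizes=(8, 8), solver="lbfgs",
                    max_iter=2000, random_state=1)
net.fit(X, y)
print(net.predict([[0, 1], [1, 1]]))  # expected: [1 0] once trained
```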

The benefit to public sector programs is that deep domain knowledge will not be needed by the end user or the citizenry; the machine learning algorithms do the heavy lifting and analysis for them.

Historically, organizations had to depend on limited ways to analyze and report on information, which to a certain degree limited decision making and program outcomes.  With machine learning, a key benefit is accessing these systems with natural language or extremely flexible visualization tools, which make decision making easier and more productive, since cognitive systems are about knowledge automation rather than process automation.

Hypotheses Testing

A hypothesis is a proposed answer or explanation to be tested.  From there, a cognitive application uses the information that resides within a certain corpus or domain of knowledge to test the hypothesis.  Unlike humans, who typically test hypotheses serially, one of the key benefits of a cognitive system is that it can test hundreds of hypotheses in parallel (a sketch follows after the list of benefits below).  We see this occurring in areas such as health care and intelligence, where various proposed outcomes are tested against a domain of knowledge.  The domain of knowledge can comprise many different sources and types of information.  Given that cognitive systems can test large volumes of hypotheses at high speed, programs and applications benefit from improved means to make decisions with confidence and to remove the “noise” surrounding what is being resolved.  Some benefits of hypothesis testing are:

Causal induction provides a key benefit to the user, since it is based on statistical models deriving insight from a corpus or domain of knowledge.  As these models become more refined, the ability to derive insightful responses provides more meaningful interactions with the end user or citizen.

Probabilistic reasoning can generate multiple responses to a question, allowing the user to see all aspects of an outcome rather than a single biased answer to the problem at hand.  This is predicated on the system having enough context and arriving at a specific level of confidence to provide an answer.  As systems learn through interaction and feedback, they will be able to identify whether information is missing in order to provide an answer, which again enhances the decision-making process of a project or program.
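Here is a minimal sketch of the parallel scoring idea referenced above. The corpus snippets, candidate hypotheses and scoring rule are all invented, and a real system would use statistical evidence models rather than word overlap:

```python
from concurrent.futures import ThreadPoolExecutor

# Invented snippets standing in for a domain of knowledge.
CORPUS = [
    "treatment a reduced symptoms in trial patients",
    "treatment b showed no effect in trial patients",
    "treatment a reduced recovery time",
]

def score(hypothesis: str) -> tuple[str, float]:
    """Toy confidence: best word overlap between hypothesis and corpus."""
    words = set(hypothesis.lower().split())
    best = max(len(words & set(doc.split())) / len(words) for doc in CORPUS)
    return hypothesis, best

hypotheses = [
    "treatment a reduced symptoms",
    "treatment b reduced symptoms",
]

# Unlike a human working serially, score every hypothesis in parallel
# and present each answer with its confidence level.
with ThreadPoolExecutor() as pool:
    for hypothesis, confidence in pool.map(score, hypotheses):
        print(f"{confidence:.2f}  {hypothesis}")
```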

In summary, cognitive systems combine natural language processing with advanced algorithms and modelling tools to help workers make decisions in a shorter period of time and/or to provide more meaningful insight into a larger domain or corpus of information that the end user would never have been able to access or analyze before cognitive computing technologies.

Contextual Computing

Contextual Computing: unlocking the power of enterprise data (infographic)

Can new ECM technology ease the complexity of document declassification?

 


Hidden complexity in document declassification 

The explosion in digital records has made declassification a challenge, but recent document-leak events, including WikiLeaks, have heightened the pressure to ensure that all government documents and records being considered for declassification are managed securely prior to release. Typically, only the agency that creates classified information can declassify it. If an agency produces a record containing its own information as well as information from one or more other agencies, it must not only review its own information for declassification but also must refer the record to each agency that “owns” the additional classified information in question. A final declassification determination on information in the record is made only after all agencies have rendered decisions on their respective equities. 
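The multi-agency referral rule is easy to state in code, which also shows why a single undecided agency stalls a whole record; the record and agency names below are hypothetical:

```python
# Hypothetical record: each classified portion and the agency that owns it,
# with that agency's current declassification decision.
record = {
    "id": "REC-1987-0042",
    "equities": {
        "Originating Agency": "approved",
        "Agency B": "pending",   # one pending decision blocks release
        "Agency C": "approved",
    },
}

def can_declassify(record: dict) -> bool:
    """A record is releasable only after every owning agency approves."""
    return all(decision == "approved" for decision in record["equities"].values())

print(can_declassify(record))  # -> False, because Agency B has not decided
```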

 

I believe this structural process has inherent productivity limitations: even if an agency has only minimal content in a record, it must still declassify its portion. Agencies must address these complexities to improve the efficiency and accuracy of declassification; collaboration tools integrated with ECM systems would help here as well.

 

I am seeing more ECM technology being deployed that addresses these issues by not only managing the classification, organization and retention of information but also automating delegation and task management. This consolidated, end-to-end approach helps ensure that all aspects of document declassification occur in accordance with legislation and in the time frame required.

Add to this the fact that digital content is growing not only in volume but also in “type”, and ECM systems in government need to automate more and, as I have said in the past, integrate policies into the underlying process flows in order to be truly efficient.

 

Thoughts?

Access to information under pressure

In 2013, over 704,000 Freedom of Information Act (FOIA) requests were submitted to the United States federal government, and roughly 95,000 remained backlogged at the end of the 2013 fiscal year. According to FOIA.gov, many requests were answered in incomplete form, usually missing information. Almost half of those requests were considered incomplete by the requestor because the government agency either denied the information or did not provide it in full.

 

With press coverage highlighting cases in which FOIA requests were challenged in court, and with the sensitivity around declassification of government documents, government organizations are under pressure to be more responsive. There have been attempts to streamline the process of fulfilling FOIA requests and make accessing information easier, but the onus still falls on the individual government agencies to manage fulfillment.

 

Beyond FOIA requests, many agencies need to declassify documentation in accordance with records, archival and compliance guidelines. Declassification can quickly become a challenge because the originating department that classified the document, or parts of a document, must manage the declassification of that content. Accordingly, various documents may have multiple agencies or departments that must coordinate and sometimes deny declassification. Mandatory declassification can be requested on its own or as part of an FOIA request, but a challenge to a declassification request must be submitted separately. All of these moving parts and processes can result in delays, backlogs and high costs, as well as additional legal challenges around declassification.

 

I believe what is needed is to combine several technical capabilities to address the requirements of FOIA requests as well as the redaction and declassification of documents. Technically, organizations should be able to integrate and automate more of the declassification workflow and information gathering to help agencies meet the timeframes of FOIA requests or document declassification.

 

We are reaching the limits of existing FOIA technology

 

With increased demand for transparency and access to information by policy analysts, legal system workers, journalists and citizens, the need to manage the FOIA process and respond in a timely fashion becomes paramount. The typical response time for a request for information is 20 days, but that time can quickly balloon as more requests arrive.

 

Requests can be complicated and may follow a variety of processes across government organizations. Moreover, once information is declassified it must be appropriately managed, as must the growing number of court documents generated by legal challenges for information. For example, 372 FOIA lawsuits were filed in 2013, generating 1,800 documents. By automating tasks around the collection of documents and the delegation of files, government organizations can better manage and measure timelines.
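As a sketch of the kind of timeline automation I mean, the snippet below tracks each request against the 20-day response window mentioned above so overdue files surface automatically; the request data is invented, and real deadlines would be computed in business days:

```python
from datetime import date, timedelta

RESPONSE_WINDOW_DAYS = 20  # simplified to calendar days for illustration

# Invented sample requests: (request id, date received).
requests = [
    ("FOIA-2013-0041", date(2013, 4, 1)),
    ("FOIA-2013-0057", date(2013, 4, 28)),
]

def is_overdue(received: date, today: date) -> bool:
    """Flag a request whose response window has already elapsed."""
    return today > received + timedelta(days=RESPONSE_WINDOW_DAYS)

today = date(2013, 5, 6)
for request_id, received in requests:
    status = "OVERDUE" if is_overdue(received, today) else "on time"
    print(request_id, status)
```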

Taking the first steps to improve the situation

If your agency experiences challenges or delays fulfilling FOIA requests or declassifying documents, there are several steps you can take to improve efficiency:

 

  • Assess your agency’s current approach to handling FOIA or mandatory declassification requests.
  • Understand whether there is a backlog of requests and whether the information being provided to requestors is as complete as possible.
  • Identify where requests or processes can be improved, and define metrics for success.
  • Establish the scope for your improvement project, then do both a technical and a business assessment, and ensure a robust business case is developed.

Thoughts?

 

Update and Observations May 2014

I have been meeting with government and healthcare organizations over the past few months, and it is apparent that there is a general frustration with existing work-based technology and applications.  I have discussed this before: we are technically “rich” in our personal lives and technically “poor” at work.

ERP, HRMS and CRM systems deployed over the past 20 years had inherent user interface issues (i.e. they sucked).  In past Business Value Assessments we found end users printing screenshots of the work they did in an ERP system because they did not trust the technology and feared being audited: that is fundamentally bad, supposedly using technology while reverting back to manual processes.

So with your personal life filled with smartphones, tablets, smart televisions and smart thermostats that talk to your smartphone, and with the “appification” of everything giving you, the consumer, a broad choice of applications, data and platforms, your expectations rise rapidly.

So when you show up to work on a Monday and face multiple login screens to archaic and dysfunctional technology, you get depressed (which I would argue affects your productivity).  You have diligently documented how to get work done via the “cheat sheets” at your cubicle or work area, and you plod along, the smartphone on your hip juxtaposed with what you look at on your workstation.  Things need to change; you need a richer work experience.

So, as mentioned, my discussions with clients in all areas of the public sector have been interesting.  They want to be more effective at getting work done, with technology that provides what they want, when they want it, on the platform of their choice.  Over the next few posts I will be covering:

Work Optimization – Think about what method you use to get work done.  Task-outcome and time-sensitive work are done differently, and there is always a process or workflow that productive people stick to, so understanding how people work becomes more important when deploying technology.

Open Data – What does this mean, and how does ECM fit in?  With open data sets, organizations have information in a format that can be repurposed and presented so that constituents are better informed and government agencies can enhance their decision making.

Contextual Computing – IBM has recently studied this developing area and how it helps people work better through better decision making.

As always, I look forward to questions and discussion.
