Why did our (blank) program fail?

Reality

The statistics are staggering.  The one that always stands out to me is the Standish Chaos Report.

This research shows that 31.1% of projects will be canceled before they are ever completed. Further results indicate that 52.7% of projects will cost 189% of their original estimates. On the success side, only 16.2% of software projects are completed on-time and on-budget. In the larger companies, the news is even worse: only 9% of their projects come in on-time and on-budget. And even when these projects are completed, many are no more than a mere shadow of their original specification requirements. Projects completed by the largest American companies have only approximately 42% of the originally proposed features and functions. Ouch!!!

While this report focuses on technology projects, research on business-focused projects shows highly similar results.

Over the course of this 10-part series, we will look at 10 reasons why this (blank) fails.

First Reason-Vision

The executive team doesn’t have a clear vision.

This is said often, but perhaps not heard very often. And what does it mean?  The first part is understanding.  Often, business leaders don’t understand what it is they are “buying.”  They hear a buzzword and go full speed ahead without really understanding and agreeing on what they are embarking upon.  Because of this, the dynamic becomes fire, aim, ready versus, well, you know.

Vision is more than just getting a thought in your head and then telling the team to go do it.  It is more than a statement.  According to James M. Kouzes and Barry Posner:

“As counter-intuitive as it might seem, then, the best way to lead people into the future is to connect with them deeply in the present. The only visions that take hold are shared visions—and you will create them only when you listen very, very closely to others, appreciate their hopes, and attend to their needs. The best leaders are able to bring their people into the future because they engage in the oldest form of research: They observe the human condition.”

Shared vision: that is what it is all about.  Creating it with as many people as you can will lead to higher rates of success.

First step

A tool to deal with this failure point is the change equation.  Created by David Gleicher in the early 1960s, and refined by Kathie Dannemiller in the 1980s, the formula for change provides a model to assess the relative strengths affecting the likely success of organizational change programs.

The change equation is D x V x F > R.

This means that if the product of Dissatisfaction with the current state (D), the leadership team’s Vision of what is possible (V), and the First concrete steps that can be taken toward that vision (F) is greater than the Resistance to the change (R), the change will succeed.

If the leadership team doesn’t know what success looks like and more importantly how to communicate that to the team, the effort is doomed from the start.

You can rate these items on a scale of 1-5, rate the resistance of all the key stakeholders or change targets, and see where you stand BEFORE YOU EVEN EMBARK ON THE EFFORT!
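Here is a minimal sketch in R of what that pre-launch scoring could look like. The stakeholder groups, the 1-5 ratings, and the choice to scale resistance up to the same range as the D x V x F product are all illustrative assumptions on my part, not part of Gleicher’s or Dannemiller’s formulation.

```r
# Hypothetical pre-launch scoring of the change equation D x V x F > R.
stakeholders <- data.frame(
  group           = c("Executives", "Middle managers", "Front line"),
  dissatisfaction = c(4, 3, 2),   # D: 1-5 rating of pain with the current state
  vision          = c(5, 3, 2),   # V: 1-5 rating of how clear and shared the vision is
  first_steps     = c(4, 2, 2),   # F: 1-5 rating of how concrete the first steps are
  resistance      = c(2, 4, 5)    # R: 1-5 rating of resistance to the change
)

# Scale resistance to the same 1-125 range as the product so the comparison is apples to apples.
stakeholders$change_force   <- with(stakeholders, dissatisfaction * vision * first_steps)
stakeholders$likely_success <- stakeholders$change_force > stakeholders$resistance * 25

stakeholders  # any FALSE rows show you where to do more work BEFORE launching the effort
```

Because the model is a product, a near-zero score on any one factor (a fuzzy vision, for example) sinks the whole equation no matter how strong the others are.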

Another Tool

Another interesting tool is Google’s reWork platform.  They have a tool here:

https://rework.withgoogle.com/guides/managers-set-and-communicate-a-team-vision/steps/create-a-vision-with-the-team/

This takes the team through the process of identifying:

  1. Core Values
  2. Purpose
  3. Mission
  4. Strategy
  5. Goals

Established collectively with the team, these elements together make up the team vision.  A very interesting approach.

Of course, the F in the equation speaks to the first steps towards attaining that vision.  This is critical as a vision not executed is just a wish, one of those hopey changy things.

Transitioning from vision to action is what F is all about.  You don’t have to know all the answers, but having some view of how to move off the “X” will get the team moving.  Don’t wait to bake in all the answers; make a rough plan to get started and move!  Think of it as a trip across the United States from New York City to LA.  You could plan the whole thing from start to finish, but would that be wise?  Traffic, weather, and construction will all change your route along the way.  So make a rough plan, start the journey, and react and change as needed.

I’ll leave you with a quote from John Carmack: “A strong team can take any crazy vision and turn it into reality.”

Global Terrorism Database Analysis


Terrorism Statistics from The Global Terrorism Database (from Gary LaFree, http://www.terrorismanalysts.com/pt/index.php/pot/article/view/89/html)

82,000 incidents since 1970

1. The top 20 countries and territories in terms of terrorist attacks account for nearly 72 per cent of all terrorist activities while constituting less than 10 per cent of all countries of the world. Two per cent of the world’s countries account for more than 27 per cent of the world’s terrorist attacks. Five per cent of the world’s countries account for half of the world’s terrorist attacks. (Pure Pareto)

2. In the four years prior to 9/11 worldwide terrorist attacks and fatal attacks were at their lowest level in 20 years. However, both total and fatal attacks have increased considerably since then.

3. Colombia is the most frequently attacked country, and Iraq has the highest number of fatalities.

4. 3.4% of attacks and 9.4% of fatalities occurred in the US; the remainder occurred outside the US.

5. The Shining Path (Peru) has the most events and fatalities of any terrorist organization.

6. 55% of terrorist events result in 0 fatalities with 1% resulting in over 25 fatalities.

7.  44% of events involve explosives and 36% involve firearms.

8. Nearly 75 per cent of the terrorist organizations identified in the GTD from 1970 to 1997 lasted for less than a year. These results suggest that most terrorist groups are like most business start-ups: very likely to disappear during their first year of operation. Forming and maintaining groups is not easy, despite impressions to the contrary from the media.

Team SPEARHEAD Red Light and Blue 006 Fundraiser


Fundraiser to support the Charlotte Bridge Home and their programs for local veterans!
Join special forces soldiers, veterans, public servants, patriots, families, and fitness enthusiasts in a weekend of fun and challenges.

Great for those interested in endurance events, team challenges, CrossFit, hiking, running, and family fun.

Challenge yourself, teammates, workmates, and your friends all while taking part in a fundraiser for a great cause!

Work together to succeed! 99% completion rate!

Choose an event that is right for you!

Or, join us all weekend for the full Gauntlet!

REGISTER ONLINE: http://buytickets.at/teamspearhead/98606

EVENT DESCRIPTIONS
SHOOT
Friday, August 18th – 7am-1pm

Charlotte Pistol and Rifle Club

9130 Kensington Drive; Waxhaw, NC 28173

Beginners and Advanced shooters welcome. Active Duty and Veteran special forces instructors. Pistols and rifles. Bring your gun and ammo or bring $ for ammo.

$65.00

GAMBIT – SCAVENGER Hunt
Friday, August 18th – 2-6pm

Meet at Seaboard Taproom and Winery

213 N Trade Street; Matthews, NC 28105

Work in teams of 4-8 to find points of interest and complete challenges. Cell phone needed to upload pictures to the app.

$35.00

WAR STORIES & FREE BEER
Friday, August 18th – 6-9pm

Seaboard Taproom and Winery

213 N Trade Street; Matthews, NC 28105

Unwind, enjoy camaraderie, beer, and snacks, and hear active duty Army Special Forces soldiers tell stories and share experiences from years of service to our nation.

$10.00

GO RUCK LIGHT
Saturday, August 19th – 9am-3pm

Starting point: Matthews Elementary School

200 W McDowell St, Matthews, NC 28105

Red, Light & Blue Main Event! Army Special Forces Soldiers will lead, teach & challenge you to work as a team in a 6-hour military endurance event. Bring a rucksack, 4-6 bricks, water, ID, snacks, a headlamp & $20 quitter’s cash.

$75.00

GAUNTLET
Complete all 4 patch-events and earn a special 5th bonus patch (and bragging rights!).

AFTER PARTY
Saturday, August 19th – 3:30pm-7pm

Beantown Tavern

130 Matthews Station St, Matthews, NC 28105

Go Ruck Light participants and their supporters are invited to join us after the Light for refreshments, to unwind, share stories, and debrief.
See prices and REGISTER ONLINE: http://buytickets.at/teamspearhead/98606

Data Science and Six Sigma-What gives?

Is Data Science the Next Step Along the Six Sigma/Continuous Improvement Journey? Maybe

As I have written before, a common question I get is whether or not six sigma still exists.  This, of course, is not an easy question to answer.  Some of the “big” companies have already burned through six sigma.  Some small to medium-size companies are just beginning it, and some are starting it and calling it something else, maybe even data science.

Maybe there is a different answer.  Below is a one-picture definition of data science:


Data Science in One Picture

(source: click here for more)

I would like to discuss six sigma in this context.

Part 1

Define

The first block in the lower left, data quality, is the first step in the data science process.  This means the data must first be analyzed for how accurate and precise it is.  In six sigma speak, this would be called a Measurement System Analysis or Gage R&R study, which is actually done in the Measure phase.  While these six sigma tools are fine, there are data science tools that are much more powerful for analyzing the quality of the data and cleaning it for further use.  As a Six Sigma Master Black Belt (SSMBB), I would use these tools in a project context.  As a data scientist, I tend to look at data quality from a more global perspective: as I clean data for a given effort or project, I feed the results back into the overall data architecture and governance of the business to ultimately improve global data quality rather than optimizing for a single project.
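As a minimal sketch of what that first pass can look like in R, here is a quick base R profile of missing values, duplicate records, and out-of-spec readings. The data frame, column names, and spec limits are all made up for illustration:

```r
# Hypothetical measurement data pulled in for a project.
measurements <- data.frame(
  part_id  = c(101, 102, 102, 104, 105),
  diameter = c(5.02, 4.98, 4.98, NA, 12.40),   # one missing value and one implausible reading
  operator = c("A", "B", "B", "A", "C")
)

# Completeness: share of missing values in each column
colMeans(is.na(measurements))

# Consistency: duplicate records that may inflate the sample
sum(duplicated(measurements))

# Validity: readings outside the spec limits the business defined
spec_low  <- 4.5
spec_high <- 5.5
subset(measurements, diameter < spec_low | diameter > spec_high)
```

None of this replaces a proper Gage R&R, but it shows how cheaply a first data quality screen can be scripted and then re-run every time new data arrives.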

Measure

The next step in the journey is descriptive statistics.  In a six sigma project, this is done to characterize the data and understand what it is telling us.  It may be done to understand the “Y”s in the Y=f(x) equation.  The upper left of the picture talks about data visualization; in six sigma, we would call this exploratory data analysis, or EDA.  In a six sigma project, descriptive statistics and EDA are usually done in a stats package like Minitab.  In the data science space, most analysis is done in a tool of choice, usually R or Python.  I am an R fan.  In my current projects, I use R and its associated packages to “torture” the data on a much more comprehensive basis.  I also do it more iteratively, meaning initial exploration drives more questions about what the data is telling me.  I’m not trying to compare Minitab (or JMP, or whatever you use) to R; I just wish I had known about R when I was working on six sigma projects in the past.  It is much more robust.  In six sigma, these two steps are usually done during the Define (D) and Measure (M) phases of a project. (The full six sigma project cycle is Define, Measure, Analyze, Improve, and Control: DMAIC.)
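To make the descriptive statistics and EDA step concrete, here is a minimal base R sketch. The simulated diameter measurements are purely illustrative:

```r
# Hypothetical "Y" measurements for descriptive statistics and quick EDA.
set.seed(1)
diameter <- rnorm(50, mean = 5.0, sd = 0.1)   # stand-in for the response we care about

summary(diameter)   # five-number summary plus the mean
sd(diameter)        # spread of the "Y"

# Simple visual EDA: distribution and a run chart of the measurements
hist(diameter, main = "Distribution of diameter", xlab = "Diameter")
plot(diameter, type = "b", main = "Run chart", xlab = "Observation", ylab = "Diameter")
```

The same few lines get re-run every time the questions change, which is what makes the iterative, exploration-driven style so natural in R.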

Analyze

Once you understand the variables you are dealing with, you must begin to determine the root cause of your problem.  Right in the center of the diagram, we see Diagnostics, Identifying Factors and Causes.  These would potentially be the x’s in the Y=f(x) equation: the root causes or variables that affect the outcome.  This puts you in the Analyze phase of a six sigma project.  While doing six sigma projects, it seems the focus could be very linear.  I think of the DMAIC projects I did as following a recipe: hypothesis testing, normality, chi-square, correlation, regression, ANOVA, t-tests, and multi-vari analysis.  These are all powerful tools.  Using the recipe analogy from above, I would say that where the data science approach differs is that you have a common set of ingredients, the data frame, and with R you can make a whole bunch of different, complementary recipes heading in the same direction, giving you an entire meal, versus the six sigma approach that gives you one really good (read: optimized) entrée.   By applying several R packages to the same data set, you get a more global view of your data and its interactions.
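As an illustrative sketch only, here is how a few of those recipe steps look when applied to one hypothetical data frame in base R. The response, factor, and production line names are all made up:

```r
# Hypothetical process data: response y, a continuous factor x, and a categorical factor line.
set.seed(42)
process <- data.frame(
  y    = rnorm(60, mean = 10, sd = 1),
  x    = runif(60, min = 0, max = 5),
  line = rep(c("Line1", "Line2", "Line3"), each = 20)
)

t.test(y ~ line, data = subset(process, line != "Line3"))  # hypothesis test: do two lines differ?
anova(aov(y ~ line, data = process))                       # ANOVA: do the three lines differ overall?
cor(process$y, process$x)                                  # correlation between y and x
summary(lm(y ~ x + line, data = process))                  # regression: y = f(x, line)
```

The point is not any single test; it is that the same data frame feeds every one of these analyses, so adding another “recipe” costs almost nothing.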

Improve

As we move to the next step up the line, we now see Simulation and Optimization and Forecasting and Probabilities.  In the six sigma space, we might do Design of Experiments and Process Capability studies, which are powerful tools.  On the data science side of the house, R again provides a variety of packages to make simulation and optimization more approachable and robust.  One of the things that struck me about R and data science is that the “price of entry” to use the tools is much lower.  What I mean by this is that I trained for probably three-plus years to become an MBB.  Learning how to effectively apply the data science approach took only a third of that time.  Granted, I already had the MBB/statistics background, but the learning seemed more in tune with a business focus.
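As one small example of what “simulation” can mean here, the sketch below runs a Monte Carlo simulation in base R of a two-step process and estimates how often the total cycle time misses a spec limit. The distributions and the limit are assumptions made up for illustration:

```r
# Monte Carlo sketch: cycle time = prep time + machine time, both uncertain.
set.seed(123)
n <- 10000
prep    <- rnorm(n, mean = 4.0, sd = 0.5)   # assumed prep-time distribution (minutes)
machine <- rnorm(n, mean = 6.0, sd = 0.8)   # assumed machine-time distribution (minutes)
cycle   <- prep + machine

upper_spec <- 12                            # hypothetical upper spec limit (minutes)
mean(cycle > upper_spec)                    # estimated probability of missing the spec
quantile(cycle, c(0.50, 0.95, 0.99))        # typical and worst-case cycle times
```

Ten thousand simulated runs take a fraction of a second, which is exactly why the price of entry feels so much lower than it once did.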

During the final phases of a six sigma project, you may use some tools to test whether the project was effective.  One technique I wish I had known for past six sigma projects is machine learning/data mining.  While there have been vast advances in this area in recent years to make it less complex, and a detailed explanation is beyond the scope of this article, this technique provides a very efficient way to test hypotheses and determine whether there have been major process changes.
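One hedged way to sketch that idea in base R, without any additional machine learning packages, is to fit a simple logistic regression that tries to predict whether an observation came from before or after the improvement. If the model separates the two periods better than chance, the process has measurably changed. Everything below is a made-up illustration:

```r
# Did the process change? Try to predict the period from the process variable.
set.seed(7)
before <- data.frame(y = rnorm(100, mean = 10, sd = 1), period = 0)
after  <- data.frame(y = rnorm(100, mean = 11, sd = 1), period = 1)  # assumed post-project shift
combined <- rbind(before, after)

fit <- glm(period ~ y, data = combined, family = binomial)
summary(fit)   # a significant coefficient on y suggests a real shift between the periods

pred <- ifelse(predict(fit, type = "response") > 0.5, 1, 0)
mean(pred == combined$period)   # accuracy well above 50% means the periods are distinguishable
```

More sophisticated learners (trees, random forests, and so on) apply the same logic; the classifier is just a sensitive detector of whether “before” and “after” really look different.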

The final part of the diagram, semantic analysis, has more to do with the type of data being analyzed: text versus numbers.  In a six sigma project, this type of analysis would usually be done during the Analyze phase.  I have done some of this analysis on survey data in past projects, but it could be anything from something as simple as a word cloud up to sentiment analysis.
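As a tiny illustration of the word-cloud end of that spectrum, here is a base R word-frequency count on made-up survey comments. A word cloud or a sentiment model would start from exactly this kind of term-frequency table; dedicated text-mining packages exist for both, but none are assumed here:

```r
# Term frequencies from hypothetical open-ended survey responses.
comments <- c("Delivery was late and support was slow",
              "Great product but delivery was late",
              "Support team was helpful and fast")

words <- tolower(unlist(strsplit(comments, "\\W+")))            # split into lowercase words
words <- words[!words %in% c("was", "and", "but", "the", "a")]  # crude stop-word removal
sort(table(words), decreasing = TRUE)                           # most frequent terms first
```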

Don’t read into this article that I am trying to have these two methodologies or tool sets compete.  I am just doing a little comparison, as well as describing how I will leverage both in future projects.

We will do a case study, comparing and contrasting six sigma vs. data science in Part 2 of this article.

The Work Breakdown Structure-WBS-What Bull Sxxx is this?

If you are a PM, have studied for the PMP, or you are just a PM geek, you have heard of the work breakdown structure.

The Official Stuff-Defining the Work Breakdown Structure

The “official” definition of a WBS:

A work breakdown structure (WBS), in project management and systems engineering, is a deliverable-oriented decomposition of a project into smaller components. A work breakdown structure is a key project deliverable that organizes the team’s work into manageable sections.

When I first took my PMP exam, I didn’t really believe in the real use of the WBS. It seemed like a piece of nostalgia that only PMs in the 1970s would use. That could have been because I worked at organizations that had repeatable content or tasks and I never had to “build a project from scratch”.

Now that I am in consulting, there is much more talk about deliverables and such.

So let’s talk a little bit of strategy and tactics.

Here is an example WBS from an ERP implementation. If you are an ERP implementation ninja, please do not comment on the completeness of this. It is for illustrative purposes.

work breakdown structure

Think of a WBS as answering the question: “What are you doing?”

I am implementing an ERP system. Really, what does that include? It includes Discovery, Design, Build….

It is like answering the question “What are you doing?” in multiple layers of detail.

The Work Breakdown Structure Tool

The diagram above was created in a free online tool, appropriately called WBS Tool. It can be found at http://www.wbstool.com

This diagram can be used to develop the scope of a project by breaking it down into its smallest working components, sometimes called work packages.

Some principles to keep in mind:

1. Ensure the diagram captures 100% of the scope of work to be completed.
2. Every element must be mutually exclusive, meaning two boxes on the WBS should not overlap.
3. Focus on outcomes, not actions. This means the items on the diagram should ultimately lead to a deliverable or milestone, with the detail underneath. One way to accomplish this is to use the WBS as the basis for the project schedule.

Here is an example of how this tool allows for this.

work breakdown structure

I press this highlighted button and it creates an MS Project File for me.

Once I do this, I save the file as XML and open it in MS Project. This is what I get:

work breakdown structure

Now you can use the resulting file to plan your lower-level tasks.  Generating the project schedule directly from the WBS gives you greater traceability between the definition of project scope in the WBS and the project schedule, and it lets you define the detailed tasks needed to complete the work packages.  WBS Tool also allows you to import MS Project files as .xml: just save your project file as XML and then import it into the tool.  The tool also allows for the creation of graphics.  All of your files are saved in the cloud, so beware: don’t put anything proprietary on the site.
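To show the decomposition idea itself (not the WBS Tool file format, which isn’t documented here), here is a small sketch in R that represents a WBS as a nested list and flattens it into the kind of task table a schedule is built from. The ERP phases and work packages are placeholders:

```r
# A WBS is a hierarchy: project -> deliverables -> work packages.
wbs <- list(
  "ERP Implementation" = list(
    "Discovery" = c("Current state assessment", "Requirements workshop"),
    "Design"    = c("Future state process design", "Solution architecture"),
    "Build"     = c("Configuration", "Data migration scripts")
  )
)

# Flatten the hierarchy into a task table, one row per work package.
tasks <- do.call(rbind, lapply(names(wbs[[1]]), function(deliverable) {
  data.frame(project      = names(wbs)[1],
             deliverable  = deliverable,
             work_package = wbs[[1]][[deliverable]])
}))
tasks  # each work package now traces back to its deliverable and project
```

However the tooling stores it, that traceability from work package back to deliverable and project is what makes the WBS-to-schedule handoff work.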

Benefits of a WBS:

  • Provides a solid foundation for planning and scheduling
  • Breaks down projects into manageable work packages
  • Provides a way to estimate project costs accurately
  • Makes sure no important deliverables get forgotten
  • Provides an ideal tool for project scope brainstorming and definition
  • Provides a proven and repeatable approach to planning projects

The WBS, along with the Project Scope Statement and WBS Dictionary, becomes the scope baseline for a project.  The WBS dictionary describes in detail what each box on the WBS is.  This is also important to define in detail because a box like “Current State” has much room for interpretation.

Using the WBS Strategically

The change leader realizes that there always seems to be a debate or a lack of understanding about what a project is delivering.  You can set up the WBS as the org chart of the project; it is a much more useful tool for illustrating what is included in a project.  As you construct the diagram, keep this in mind: you want the graphic concise enough to describe your project, but with too many branches it will be just as hard to decipher as a Gantt chart.

A valuable lesson learned from past project nightmares is to use the WBS to clearly lay out and define the project scope with all the critical stakeholders.


Templates for PM-Sizzle or Steak


Project managers really really really really really love their templates.

Here are some great sources of templates:

PMI
PM Docs
Bright Hub

Templates can drive business leaders insane. Why?

Did you ever ask a project manager: “Have you finished the project charter?” The answer is usually yes. But what does finished mean?

Usually, when the PM says yes, they mean all the boxes are filled in. But finished really means does everyone who is a stakeholder in the project understand and agree with the charter?

It is easy to focus on filling in a form versus ensuring the form adds value and has the proper data. This is why it sometimes comes across that project management is nothing more than a bunch of templates to be filled out and PMs are no more than order takers. As if filling in a bunch of office documents will somehow magically make the project deliver results.

So how do you ensure that you have good project deliverables that are easy to digest and that drive the project?

First, ensure that your project documentation and deliverables are put together in a logical process and order. Don’t require the same information over and over again. The project title is one thing, but I have seen so many companies whose project templates ask for the same information repeatedly. Even though you can cut and paste, why bother? Use each template for what it was intended to do. Also, don’t create a template if it will never be used again on the project. Each completed document, sometimes called an artifact or deliverable, should be used in the current phase and lay the foundation for follow-on information to be built in subsequent phases.

Second, make sure the templates have the right content. Critically review each section of the document to make sure it is asking for real, value-added information. You can start with the table of contents: does it represent the critical elements of information this document should deliver? In recent discussions about the project charter, there is a trend toward putting as much information as possible in the charter, including early information for many of the follow-on plans and project deliverables.

Take a look at the “new” project charter contents:

Cover Page:
Company Name/Logo
Project Name and Project Number
Date Created
Version Number
Revision History (Type / Date / Change and Section / By Whom)
Table of Contents
Project Title and Description
Project Scope (High-Level Deliverables and end result of the project)
Business Case (Why is the Project being done)
Project Manager Assigned and their Authority Level
Stakeholders and Stakeholder Requirements
Project Agreements: (who will provide what)
Products to be Installed/Upgraded
Exclusions (What the company is excluded from delivering)
Assumptions / Constraints
Acceptance Criteria
Measurable Project Objectives (measurable strategic goals)
High-Level Business Risks (potential risks and opportunities)
Project Plan and Milestones (Visual Project Task Spreadsheet)
High-Level Project Communications
Project Meetings – Structure / Agenda / How Often / Schedules
Weekly Project Dashboard and Status Reports – Schedules
Chart for Scheduled Meetings (meeting name, method, attendees, leader, frequency)
Contacts Lists (Name, Title, Phone, Email Address)
Requirements (High-level plan describing how requirements will be gathered)
Contingency Plans
Project Issues Management
Project Change Control Management
Project Training
Project Testing
Project Implementation
Project Closure

You can see here that many of the components of the project plan show up here…IN THE CHARTER!

When I first saw this trend, it bothered me. But the other trend I have seen is that clients want more detail, earlier, and more often.

While you may not know all the details, putting what you do know in the charter can be a way to proactively answer questions stakeholders may have.

The third and final recommendation is to review the blank template with the client first. This sets expectations about what data will be delivered and in what format. This is what I call the “plant phenomenon.” If you don’t share the template with the client first, inevitably, the first review session will be all about the format of the document, the sections, the order. By sharing the template with the stakeholders prior to the actual content being added, it prepares people for what they will see. If you have a standard process and templates are reused, it is still a good refresher to review with the team. That ensures that once the content is added, people will focus on the steak and not the plate it is being served on.

Templates are a great practice, but they must be used wisely to gain full business benefit! Bon Appetit!

Data-The Fuel That Runs Your Organization

Usually people ask me: “Why do you call your blog execution engines?”  I think of all the business tools, tactics, and techniques out there as execution engines.  They help run your organization and move it forward.

In this analogy, I think of organizations not as a car, with one engine, but as a ship with many engines.


These engines, like project management, six sigma, lean, you name it, are all methods and pieces that, when used in conjunction, propel your company forward.

So it takes many things to make these engines work, but the thing that is most needed is the fuel.  A cruise ship uses 380 tons a day and is fueled for 12 days at sea.  That’s a lot of fuel.  Sound familiar?

Organizations run on large amounts of data, but now let’s talk about that data.

Just like engine fuel, the data has to be of high quality.  If it is not, the engine will not run at the optimum, and if it is really bad, it won’t run at all.  People at all levels of the organization rely on high quality data to do their jobs, service customers, make decisions and move the ship.

I have seen many projects start out as one thing, and then the team quickly realizes that the project really needs to focus on the data first before anything else worthwhile can be done.  I worked on a six sigma project once where the goal was to fix a manufacturing process.  After doing our Gage R&R study, which looks at the measures of the process, we quickly realized that most of the data and measurements were bad.  The focus of the project shifted from fixing the process, to fixing the measurements.

In other projects, like ERP implementations, the team always spends a large amount of time realizing that the data is bad and should be fixed before anything new is started.

So why is data quality so important to your business and to your projects?  Let’s take a look.


Bad news first: 53% of companies have suffered losses, problems, or costs due to poor data quality.  And the main reason data migration projects (or really any IT projects) fail is poor data quality.  But the good news is that companies with high quality data see a 57% increase in customer satisfaction!  In addition, research by ReachForce has shown that poor data hygiene has huge negative impacts on the sales cycle/funnel, such as poor targeting and bad lead routing, but that applying data quality best practices generated a 66% increase in revenue.

So of course, the next question is, what do we mean by data quality?

There are six basic attributes of data quality:

  1. Validity: are all the data values within the critical process areas specified by the business?
  2. Accuracy: does the data reflect the real world, and is it verifiable?
  3. Consistency: is the data consistent between and within systems, and are there duplicates?
  4. Integrity: are the relationships between entities and attributes consistent, within and between tables?
  5. Timeliness: is the data available when needed?
  6. Completeness: is all the data that is needed actually present?

Wow, that seems like a lot of dimensions!
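For a concrete sketch of how a few of these dimensions translate into checks, here is some base R code against a made-up customer table. The column names, the allowed-value list, and the rules are illustrative assumptions only:

```r
# Hypothetical customer records pulled from two systems.
customers <- data.frame(
  id      = c(1, 2, 2, 4),
  email   = c("a@example.com", "b@example.com", "b@example.com", NA),
  country = c("US", "US", "USA", "DE"),   # the same country coded two different ways
  updated = as.Date(c("2023-01-05", "2021-06-01", "2021-06-01", "2023-02-10"))
)

mean(!is.na(customers$email))               # Completeness: share of records with an email
sum(duplicated(customers$id))               # Consistency: duplicate customer ids
customers$country %in% c("US", "DE", "FR")  # Validity: values checked against an allowed list
Sys.Date() - max(customers$updated)         # Timeliness: how stale is the newest record?
```

Accuracy and integrity are harder to script, since they usually require comparing against the real world or across tables, but even these four quick checks are enough to start a data quality conversation.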

Discussions about data are everywhere today.  Change leaders and project managers will have to lead the way in guiding organizations and considering data quality in all of the organization’s change work.

Some things to think of right out of the gate are performing a quality audit and establishing governance.

An audit can help you discover the source of data quality problems.  Data quality problems are often widespread and originate in your source systems, their applications, and operational processes.  The quality audit should focus exclusively on (1) verifying the quality of reported data, and (2) assessing the underlying data management and reporting systems.

Data Governance includes all the activities of which Data Quality is a part, and covers the areas of data management, business process, compliance, and people.  It also includes a linkage to technology.

You can download an outline of our data quality audit and data governance framework below.  This should get you started on your journey.

  • Sign Up Here


Welcome to the Jungle!


Boy, do we love our meetings or what?  They can be a real jungle.

Running the meeting is a whole other adventure.

If you want a shot at making an impact in the meeting, the other people in the room have to take you seriously, and great meeting introductions are your chance to make that oh-so-important good first impression.

For many people, these first minutes of a meeting will always be nerve-wracking. How the meeting leader handles these opening minutes can make a huge difference in the effectiveness of the conversation that follows.

Let’s start out with The Cardinal Rules of Leading Meeting Introductions:

Rule 1: Make sure everyone gets introduced.

If someone is important enough to be invited, they must be introduced. Introductions make sure the people in the meeting know who they’re talking to. They provide critical context for the discussion, giving everyone a sense for the range of perspectives and experience in the room. With an online meeting, having everyone introduce themselves also reveals any issues with audio or language differences.

This goes for latecomers and other people who walk into the room, too. While you shouldn’t interrupt someone to introduce a new attendee, make sure to use the next pause to quickly do so. If you’re on a conference call and the CEO walks into the room behind you, the people on the other side of the phone deserve to know that the audience just changed.

Rule 2: Provide clear direction.

Tell people specifically what you want them to share with the group, and provide an example by introducing yourself first.

After explaining what you want to hear, cover the order in which people should speak. For online meetings, go top-to-bottom through the attendee list.

Rule 3: Keep it safe.

If you give clear instructions and provide an example by introducing yourself first, you’ll have a great start on alleviating anyone’s anxiety.

To further ensure you don’t inadvertently shut someone down:

  • Never ask people to share potentially sensitive information in a business setting. Stay clear of topics that get too personal; not everyone has happy childhood memories, and lots of adults just don’t have a favorite band or ice cream flavor any more. If you must delve into the personal, save it for your team-building exercises and off-sites.

  • Don’t ask questions that make people feel they have to justify their right to be in the meeting. You may need to understand the skills and expertise of the people present, but there are ways you can find this out without making someone feel like they’re being interviewed.

This doesn’t mean you have to keep introductions terse (Name & rank, attendee!) or boring (How’s the weather there, Steve?). Instead, craft an introduction question based on rule #4.

Rule 4: Make introductions relevant to the meeting.

Context (not content) is key. The best introductions will help everyone understand how each participant relates specifically to the situation at hand.

Are they there just to listen, or do they have an agenda of their own? Are they an expert in subject, or is this all completely new? Will they be in charge of decisions, or expected to carry them out?

Include at least one question in your introductions that ties directly to the goal of the meeting and reveals some of this context.


For business and professional meetings, introductions should always include:

  1. Each person’s first and last name
    1. Then, context, context, context!
  2. The company or department they represent
    1.  This is their business context.
  3. Current location (for remote attendees)
    1.  This is their personal context; important for understanding time zone concerns, possible connection issues, and background noise.
  4. Why they’re at the meeting
    1.  This is their meeting context.
  5. To get at this last one, you might ask:
    1. What’s the most important thing you want to get out of this meeting?
    2. What are you hoping to learn here today?
    3. What prompted you to be here today?
    4. What excites you most about the work we’re doing here?
    5. What skills can you contribute to the team that may not be obvious to the rest of us?

Using these techniques not only gives you control of your meeting, but also sets the tone for you as an organized and competent meeting leader.

The Irony of Meetings (graphic)

And since, as the graphic shows, we spend a large part of our work time in meetings, this will help you lead from the front.

Data-It’s What’s Next

It’s interesting to me that while many of our engagements are project/program or change management, many of our clients also have, you guessed it, a data problem.

In my analogy of execution engines, I believe that data is the fuel that runs through and runs those engines, essentially it powers your organization.  But you knew that.

Over the past six months, I have been busy learning about data, specifically data science.  I am currently about halfway through a Data Science Specialization at Johns Hopkins University, and I had the opportunity to attend the Caltech-JPL Summer School on Big Data Analytics this year as well.  To me, it has a lot of linkage with six sigma, and data analytics is starting to come into the mainstream.

You can now see more and more companies hiring data people, analysts, scientists, etc…

The JPL Summer school was 9 days of fun.  We learned about things like Big Data Architecture, Machine Learning, Programming, Content Detection and Analysis, Inference and Uncertainty, Databases, Data Visualization, Clustering and Classification, Decision Trees, Dimensionality Reduction, Semantic Web Technologies, Genetic Algorithms  and even an Introduction to Cloud Computing.  This all sounds really complicated, and yes there were many pocket protectors present.  But it was very fascinating to see where this emerging field is going.

Business and science are becoming increasingly more digital.  The core challenge is that there are now large complex distributed data sets, and to gain new knowledge, we must be able to effectively explore them.

We are in the age of a new paradigm.  Science was initially about experiment and measurement, then it moved to analytical theory (think Einstein), then to numerical simulation.  Now we find ourselves in the 4th paradigm, which is data driven science.

This mass of data is absolutely amazing.  Take astronomy, for example.  Astronomy today has 10PB of data.  That is petabytes (to make your head spin, click here).  10TB a day is generated.  So essentially, data rates and data volumes double every 1.5 years.  In addition, not only does the volume of the data grow, the complexity of the data increases.

If you think about Moore’s law (boy, I’m really geeking out) as it applies to data, data will grow in size and complexity:

  1. From data poverty to data glut
  2. From data sets to data streams
  3. From static to dynamic, evolving data
  4. From any time to real time analysis and discovery
  5. From centralized to distributed resources
  6. From ownership of data to ownership of expertise.

Understanding complex phenomena requires complex data!

A new process will arise: Data Gathering -> Data Farming -> Data Mining -> Data Understanding -> New Knowledge

Much of the data flowing through this process will never be seen or comprehended by us mere humans.

Said another way, The Challenge is Big Data, The Solution is Data Science.  Addressing the challenge of Big Data is on the critical path for many organizations.

The presenters made TWO ENTIRE BOOKS available to us.  You can download them by signing up at the bottom of this page. 

NASA/JPL talked about how they use the data lifecycle on their deep space missions.  The data lifecycle as defined by them is:

Data Generation -> Data Triage -> Data Curation -> Data Transport -> Data Processing -> Data Mining/Visualization -> Data Analytics

From this, they feel there are common challenges in massive, distributed, heterogeneous data such as:

  • Defining the data lifecycle for different domains in science, engineering, business
  • Capturing well-architected and curated data repositories
  • Enabling access and integration of highly distributed, heterogeneous data
  • Developing novel statistical approaches for data preparation, integration and fusion
  • Supporting analysis and computation across highly distributed data environments
  • Developing mechanisms for identifying and extracting interesting features / patterns
  • Developing methodologies for reconciling predictive models vs. measurements
  • Methods for visualizing massive data
  • Providing a trusted basis for actionable results of data analytics

One solution to this is the Apache OODT: Object-Oriented Data Technologies.

It is an open source data science framework developed at NASA/JPL.  It is used across multiple NASA centers (JPL, GSFC, LaRC) and multiple agencies (NASA, NIH, NSF, DARPA, NOAA), integrates with an information architecture of metadata and ontology (e.g., earth science, biomedicine), significantly reduces the cost and increases the performance of data processing and management, integrates with data analytics, and has been applied to Earth science, planetary science, astronomy, biomedicine, and defense.  You can find out more about the framework here.

Our firm is committed to helping our clients manage the data challenges they will have in the future.  Look forward to our white paper, Data: The Fuel That Runs Your Organizational Engines, due out 2/1/16.

  • Sign up here to get the two data analysis books for free!


Executive Briefing-Planning, Running and Improving Your Business

Happy New Year!

To kick off the new year right on our blog, we have published a white paper.  You can find the synopsis and paper below!

Business leaders are continuously bombarded with buzzwords, advertising and “improve quick” schemes. While larger companies may have the infrastructure and means to search for and deploy a series of initiatives aimed at defining and improving their businesses, small to medium size companies struggle to find an approach that will get them the meaningful business results required to sustain their business.

Businesses at all levels are challenged to define the right strategy, execute it, and improve performance year over year. This raises another quandary: what tools, techniques, and processes should I use to get it done?

Planning, Running and Improving Your Business
