
Analytics

Analytics is a Function, Not a Job Title

Posted by on Feb 16, 2018 in Analytics | 0 comments


I recently read an HBR article that reinforced much of what we've been seeing internally at Management Concepts for the last two years: Analytics is a function, not a job title, and regardless of job title, analytics should be part of every team's profile. As the HBR authors note, you don't have to be a Data Scientist to work in analytics. Success with analytics requires a "big tent" approach: everybody in, and everybody all in.

Our student information reflects this need. Management Concepts delivers extensive training in the domain of analytics. We offer 7 courses, primarily to Federal employees, that cover data collection techniques, data analysis and modeling techniques, evaluating and presenting analysis results, and data visualization. Our internal analysis showed that the top 10 most common job titles of our analytics students are:

1. Budget Analyst – 560 series
2. Accountant – 510 series
3. Program Analyst – 340 series
4. Financial Management Analyst – 505 series
5. Management Analyst – 343 series
6. Contract Specialist – 1102 series
7. Financial Analyst – 1160 series
8. Auditor – 511 series
9. IT Specialist – 2210 series
10. Management & Program Analyst – 343 series

This list reflects a wide variety of occupational series codes in Federal employment. While some codes function as "catch-alls" for positions that span many roles (e.g., the 343 series), other codes are very specific in their requirements (e.g., 1102). None of these roles has the analytics market cornered – to be truly effective, you need to staff your team with people who can conduct analyses. Whether you fill this need by hiring multiple staff members with analytics capabilities, or by training existing staff to perform the function, depends on your organization's specific needs. For small organizations just starting to use analytics in their operations, it's not necessary to create an entire analytics unit. You can train, contract, or hire for the role. Learn more about how we can help you pursue these options by visiting...


Adaptability: The Underestimated Skill

Posted by on Jan 16, 2018 in Analytics | 0 comments


We all know the definition of insanity – doing the exact same thing over and over again, expecting different results each time. But what happens when you're trained to always run the same report, or write the same code, and something fundamentally changes? Do you try to force-feed the existing process into the new system? Or do you thoughtfully and methodically work within the new environment to solve the problem? Adaptability isn't always a skill associated with analysts, but I'd argue that, for everyone's sanity, it's one we all need.

Picture this – you manage a monthly report. Everything is automated cleanly, and save for a few quick gut-checks, there is very little manual work required. Then your office updates its software, or switches database management systems. Suddenly your process doesn't work at all, or if it does, it returns vastly different results. You make a relatively minor tweak, knowing that this has worked countless times before, and try again. And again. And again. And the insanity begins to kick in.

The good news is that, contrary to popular belief, adaptability can be learned. When I find myself banging my head against the wall, I try to employ one of these strategies:

- Take a step back. Leave the problem for an hour or two (or better yet, overnight!), and focus your energies on something different.
- Approach it from someone else's perspective. Think through the questions you would ask if someone else came to you with this problem.
- Collaborate. Ask a colleague, and frame the problem broadly so you don't limit their response based on what you've already tried.

Just as you hear "y'all" in the South and "yinz" in Pittsburgh even though they mean the same thing, coding languages and formulas are sometimes written slightly differently based on the environment. If you have experience with Tableau, you know the frustration of trying to find the correct syntax when replicating an Excel formula you have already perfected.
Or if you've worked in SQL, switching from Oracle to Microsoft SQL Server can be the impetus for Googling "translate PL/SQL to T-SQL". In each scenario, adaptability is critical to success. If you have the skills to adjust to your circumstances, you are less likely to spend hours trying to force an old solution onto a new problem, and more likely to avoid the insanity that can come with it.

With these tips, and the right training and resources, analysts can learn to adapt to the right language for their environment, and avoid going insane. Let Management Concepts help you excel in your organization by providing the right mix of training to increase your problem-solving and adaptability skills to complement your analytics...


Data, Data Everywhere…Albatross or Opportunity?

Posted by on Nov 30, 2017 in Analytics | 0 comments


Samuel Taylor Coleridge's Rime of the Ancient Mariner tells the story of a sailor returning from a long voyage. In the early parts of the poem, the mariner describes how an albatross leads the crew out of an ice jam in the Antarctic, but the mariner then kills the bird. The crew vacillates between viewing the killing of the albatross as a good or a bad thing. Soon, however, they find themselves surrounded by "water, water, everywhere, nor any drop to drink." Eventually, the mariner encounters death, but survives after accepting his guilt; as penance for shooting the bird, he must wander the earth telling his story to passersby.

This poem introduced a number of concepts into our common understanding, particularly the metaphors of the albatross and of being surrounded by something (in this case, water) that is of no use in its present form. Both metaphors apply to the current state of organizational decision making in both the Federal and private sectors. Like the ship in the poem, we are surrounded by data. But the question is whether we use the data to make decisions – are the data just data? Or can they drive decisions? Can they be evidence? Are these data points an opportunity or an albatross? Are they the water surrounding us that we are unable to drink?

One recent public law is trying to clarify and rectify this situation. The Commission on Evidence-Based Policymaking (CEP) was established by the bipartisan Evidence-Based Policymaking Commission Act of 2016 (P.L. 114-140), jointly sponsored by Speaker Paul Ryan (R-WI) and Senator Patty Murray (D-WA), and signed by President Barack Obama on March 30, 2016. The Act aims to address the use of existing data to improve how government programs operate. The mission of the Commission was to develop a strategy for increasing the availability and use of data to build evidence about government programs, while protecting privacy and confidentiality.
In September 2017, the Commission issued its report outlining recommendations to improve data access, strengthen privacy, and enhance the government's capacity for building evidence. I attended the CEP's briefing on the report hosted by Washington Evaluators, a professional association that I recently joined. I also attended a session last week at the American Evaluation Association's Evaluation 2017 that highlighted the report. These panels discussed the results of CEP's final report. From my point of view, the most interesting and relevant points are the recommendations related to increasing capability in evaluation. To be able to evaluate the effectiveness of programs, and use the available data, Federal agencies need to focus on building their evaluation competency.

I'll close with some simple guidance: It is difficult to deal with the volume of data we have available to improve our decision making. But our operating environment does not have to be "data, data everywhere." Small efforts can make a big difference. Most organizations, even in the private sector, are not yet at a high level of sophistication here. There is hope, though, in keeping it simple – by choosing and using the right evaluation approach and the right analytics for the right questions, your organization can practice evidence-based decision making and turn the data you have into an opportunity instead of an albatross. Learn more about our training and services to help your organization increase...


What Is Data Science and Why Do We Need More Data Scientists?

Posted by on Sep 19, 2017 in Analytics | 0 comments


If you take a quick look at Glassdoor.com and their "50 Best Jobs in America" for 2017, you will notice that Data Scientist ranks at #1, among several other "data/analytics" titles in the list. Although this is evidently a highly desirable job, there appears to be a large talent gap in the field. Recent research by McKinsey Global Institute has projected that by next year, the US could face a shortage of 140,000 to 190,000 professionals with deep analytical skills, i.e., advanced statistical training and knowledge of machine learning.

What exactly is data science? While not an entirely new concept, the terms "data science" and "data scientist" have exploded in popularity in the past 5-10 years. At its core, data science is about using scientific methods along with new technology systems to gather insights from massive amounts of heterogeneous data.

For the Federal Government in particular, there's much to be done in order to attract and retain top data scientists who will work with some of the largest databases in existence. The Networking and Information Technology Research and Development (NITRD) Program has come up with a plan to address this issue. One of their strategies is to "Improve the national landscape for Big Data education and training to fulfill increasing demand for both deep analytical talent and analytical capacity for the broader workforce." So there is clearly a focus on growing data science programs and preparing the next generation of super-savvy data nerds… but why?

1. The importance of data-driven decision making cannot be overstated
2. Data scientists are skilled at asking the right questions
3. Data scientists are uniquely equipped to make sense of big data

1. Data-Driven Decisions

In these fiscally-constrained times, more and more agencies are being assessed on how effectively they are achieving their mission. Decisions regarding how to deploy the government's biggest resources – people and money – for maximum efficiency must be evidence-based.
However, there are many challenges to becoming a fully data-driven organization. A recent article by Deloitte University Press describes these obstacles, but also provides solutions and success stories for each one.

2. Defining the Data Question/Need

Any worthwhile analysis begins with a question. The type of question determines what kind of data and which analyses are required. Data scientists have a strong background in research, which allows them to help frame the question and select the appropriate method of data analysis. It's important to note that data can be used to answer many questions, but not all of them.

"The data may not contain the answer. The combination of some data and an aching desire for an answer does not ensure that a reasonable answer can be extracted from a given body of data." – John Tukey

3. Big Data

Any data-related article you read within the past 5 years will make reference to the exponential growth in both the depth and breadth of data available ("big data"). This is perhaps best illustrated by IBM's estimate that 90 percent of the world's data had been generated in the preceding two years. And that assertion was made in 2013, so the volume is now even greater. We can no longer brush off the notion that big data is a global phenomenon; its potential benefits are now well-documented. It has already begun to shape the...


Same Data, Different Conclusions: It’s a Good Thing

Posted by on Aug 15, 2017 in Analytics, Workforce Management | 2 comments


In just a couple weeks, I will be on my way to Italy for a two-week vacation! Che bella! When I was deciding where to go with my husband, I experienced a bit of work-related decision-making déjà vu. Every option we considered was very different from the next, but all were possibilities chosen based on the same exact set of data (number of vacation days available, budget, etc.). I felt like I was back at the office working with a team, arguing about what decision to make, even though we're usually reviewing the same data. The difficulty can be that different does not necessarily mean wrong in these situations. So, how do you successfully prepare for those conversations where everyone's perspectives are different, but also valid?

First, think through your own decision-making process: How did you land on your decision? How did you weigh and prioritize each piece of information? Maybe you want to spend as many days as possible on big, once-in-a-lifetime vacations, and so you decide to go all-out on a long safari trip in Africa. Or maybe you'd rather scatter PTO on shorter vacations to places close by, always with a buffer day to recover when you get home, so you plan several city-hopping long weekends to Miami, Seattle, San Diego, and Boston throughout the year. Once you understand your own process, you'll be better equipped to collaborate with others.

As you enter these conversations, keep in mind that everyone is going to approach a decision from their own unique perspective. Ask yourself:

- How could others interpret this data?
- How might your own decision change if you had additional data?
- Is there new information that others have that might influence your decision?
- What decision, or rather, what desired outcome of the decision, is best for the mission?

If you take a few moments to think through the answers ahead of time, you'll be more prepared for the conversations, and likely more open to possibilities.
It will help you understand where others are coming from, as well as what potential differences to expect. More often than not, analyses do not point to a single decision. This can invite conflict when stakeholders bring different perspectives to the table. By arming yourself with your own decision-making process, and an understanding of how others might interpret data, you are setting yourself up for success. Data-driven decision making, naturally, is equal parts data analysis and collaborative, constructive decision-making skills.

For more strategies, techniques, skills, processes, and models on this subject, along with training for influencing stakeholders, register here for our new two-day course, Data-Driven Decision Making. And just remember, an African safari isn't wrong, it's just different! Ciao!

—

Enjoy this blog post? Check out our other recent Analytics blogs, and don't miss our other posts on data-driven decision making. And don't forget to subscribe to this blog by using the form at the top-right of this...


Defining the Problem – How Savvy Data Pros Get It Right

Posted by on Jul 23, 2017 in Analytics | 0 comments


Ever hear the phrase, "The first step is the hardest"? While it may be a common phrase for motivational posters, it also applies to the first step in making data-driven decisions: defining the problem you're trying to solve. Without a clearly or accurately defined problem, time and other resources will be wasted, and you'll be left unable to make an effective decision. Often, nailing down exactly what you're trying to solve is more difficult than the analysis itself.

Analysts – how many times has a decision-maker come to you looking for specific metrics or data points without providing any context? Decision makers – how many times has an analyst followed your direction, but the output doesn't help to solve the problem?

The good news is there are steps that can be taken to avoid these common roadblocks – but it's a two-way street. Decision makers should empower analysts to ask questions, and keep an open mind about potential analysis outcomes, regardless of any preconceived notions of which data points are the "right" ones. Analysts should ask questions, and not limit themselves to the metrics or analysis methods originally proposed. This will provide an environment in which the analysis plan can be structured to solve the question, not just spit out the data points requested. Easier said than done, huh?

As a best practice, use the questions below to guide conversations around defining a problem. Analysts should be prepared to ask these types of questions, and decision makers should be prepared to answer them. Both groups should be prepared to potentially adjust their expectations based on the outcome of the conversation.

1. What is the problem you're encountering, and what is the context around it?
2. What decisions would you like to make based on the outcomes of the data analysis project/request?

By collaboratively defining a problem, analysts and decision makers will make the first step in data-driven decision making that much easier.
It will lead to less wasted time, and more effective, strategic decisions. The first step is usually the hardest – but it doesn’t have to be! For more strategies, techniques, skills, processes, and models on this subject, along with training for influencing stakeholders, register here for our new two-day course Data-Driven Decision Making. And don’t forget to subscribe to this blog by using the form at the top-right of this...


A Perfect Match: Data Analysts Pair Technical Skills with Soft Skills

Posted by on May 30, 2017 in Analytics, Workforce Management | 4 comments


I hope everyone had a good Memorial Day Weekend – the unofficial kickoff to wedding season! In that spirit, I'm going to discuss a very important relationship in every data analyst's life: the marriage of technical skills to their soft-skill counterparts. Just like every relationship, this one requires understanding and balancing complementary aspects to be successful. For our model skill-coupling, let's look at how knowing your sphere of influence—i.e., awareness of what you can do yourself, and how you can influence others—works hand-in-hand with your skill for driving data-based decisions. In data-reliant work, awareness of your sphere of influence and data-driven decision making make for a natural, career-long skill partnership. They work better together, and enable you and your team to do your best work. Here are two ways these skills blend and support each other:

Understanding

When interpreting the results of an analysis for the purposes of making a data-driven recommendation, it is critical to understand your sphere of influence. For example, say an analysis is requested because your organization wants to know the best way to increase profits. If hiring decisions are out of your (and your supervisor's) control, then you shouldn't suggest adding more staff. Instead, focus on data that supports efficiency and cost-cutting measures within your influence. Data-driven recommendations are more effective when they consider both the analyst's and the decision-maker's spheres of influence, and are centered around practical action steps.

Balance

Consideration for sphere of influence must come with a healthy dose of technical analysis. The balancing act of building a practical recommendation based on what you can influence, while following the data's lead, can be a difficult one to master. If you lean too far toward what you can influence, sometimes the decisions aren't strongly supported by data.
On the flip side, basing recommendations solely on data, without any organizational context, can lead to impractical decisions that stray from strategy. Striking the right balance allows decisions to be not only actionable and immediately useful, but mindfully and strategically aligned with what the rest of the organization is working toward.

To achieve balance and understanding, try asking yourself three questions whenever your data analysis reaches the point of needing an action or a decision from others:

1. Why is this decision important, and why is it a good move?
2. What about this issue is in my control, what is not, and who can help?
3. Does my analysis support our strategy, and is that clear to others?

Technical, hard-analytical skills and soft skills are natural, complementary partners—they make up for what the other lacks, and the two are more powerful together than apart. When united, they can take data analysis, decision making, and people to the next level. But as common wisdom would have it, relationships take work. Register for our upcoming training opportunities to improve your decision making, your evaluating and presenting of analysis, and other analytics skill areas, and learn how you can improve your strategic communication and influencing...


Put Your Data Visuals on a Diet: No More Pie Charts!

Posted by on Mar 13, 2017 in Analytics, Workforce Management | 0 comments


Happy Pi Day! To celebrate, I will be discussing one of the most widely used (but not widely useful!) data visualizations: the pie chart.

Data-driven, strategic decisions are only as good as the information behind them—and to decision-makers presented with data, the information's only as good as the chart or graph that represents it. As data analysis is used more and more, the communication of actionable information becomes just as important as the information itself. While charts and graphs are invaluable tools for easily and quickly communicating a complex message to an audience, if they aren't wisely designed, they can also be misinterpreted, misleading, or even deterrents to action. And pie charts—despite their deceptively simple style and popularity—often lead viewers astray, or lead nowhere at all. So, here are 3.14 reasons to never use pie charts!

1. Distortion of the Information

3D and 'explosion' effects are very common pie chart features. But look at the following example of regional sales data—can you tell which region had the most first-quarter sales? I can't tell either, and I built the chart! The explosion makes it difficult to gauge the slices' sizes in relation to each other. And when using the 3D effect, slices that are closer to the reader appear larger than the others. Let's remove the 3D and explosion effects and see if that helps. Still stuck? Me too. Dark colors naturally look larger than light colors, making it difficult to compare. Instead of a pie chart, use a bar chart that you can sort and label efficiently. Bar charts can often display the same exact data without distorting the information—and audiences are always relieved by easy-to-read charts.

2. Difficulty Communicating the Information

How many times have you seen a pie chart with too many slices and a crowded legend?
If your eyes need to move back and forth between the chart and the legend (and who can squint hard enough to read that legend, right?), you're not focusing on the information or what it means. Instead, your brainpower is spent remembering which color matches each legend entry. Let's try to fix it by limiting the number of pie flavors. This shows the top pie preferences, and buckets everything else into 'Other'. But we're now inviting questions about what falls into the mysterious 'Other' category rather than effectively communicating the pie preferences. (And if you hadn't noticed, we're still experiencing distortion of information with the colors—e.g., is the dark blue slice twice the size of the orange slice? Hard to say!) A bar chart allows you to communicate the important information without inviting tangential (or completely unrelated!) questions.

3. Difficulty Drawing Meaningful Conclusions

This is the result of reasons #1 and #2. When a data visualization does not accurately represent or effectively communicate the data, one of two things is most likely to happen. One – you're going to make a decision that you shouldn't. Or two – you're not going to make any decision at all. In today's environment it is increasingly important to be able to make data-driven, strategic decisions. That's so much easier when you use a chart style that fits your data and your message. Most often, a pie chart is not the right choice.

3.14. Pie is for eating, not for data. Don't get me...
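The "limit the slices and bucket the rest into 'Other'" fix described above is easy to sketch in a few lines of Python. The function name and the pie-preference counts below are my own illustration, not from any charting library; the output list is sorted largest-first (with 'Other' appended last), which is exactly the shape a sorted bar chart wants.

```python
def top_n_with_other(counts, n=3):
    """Keep the n largest categories and bucket the rest into 'Other'.

    Returns (label, value) pairs sorted largest-first, with 'Other'
    appended last -- ready to feed a sorted bar chart instead of a
    crowded pie.
    """
    ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
    top = ranked[:n]
    other_total = sum(value for _, value in ranked[n:])
    if other_total:
        top.append(("Other", other_total))
    return top

# Hypothetical pie-preference survey counts (illustrative numbers only)
flavors = {"Apple": 35, "Cherry": 22, "Pumpkin": 18,
           "Pecan": 12, "Key Lime": 7, "Blueberry": 6}
print(top_n_with_other(flavors))
```

Note that even this cleaned-up version invites the "what's in 'Other'?" question the post warns about, which is one more argument for a plain sorted bar chart over any pie.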


In 2017, Show the Way—With Data

Posted by on Jan 26, 2017 in Analytics | 0 comments


This year, more than ever, Federal government leaders will benefit from those of us who are able to visualize data effectively to get their message across. There's no better way to illustrate the ROI of a program than to literally illustrate it—with effective, irrefutable visuals of the data—which can serve as common ground for discussions about the work being analyzed.

As we begin the first 100 days of new leadership in the Federal government, we've heard many reports from the Trump administration and media analysts about an increased level of accountability when it comes to government spending. Data-driven accountability supported by well-done analysis brings the advantages of precise measures, proof of performance, and clear communication about the bottom line. I mentioned in a previous blog post that analytics can keep us honest. Let's look at this new administration as an opportunity to focus on analytics and honest assessments about program costs, benefits, and the impact on the constituents that each business unit serves.

Interdisciplinary research on data visualization, chronicled by scholars including Stephen Few and Colin Ware, teaches us that visuals help us understand the data, relate to it, and remember it. To show your program's value, illustrate its impact and relative value to the organization's mission; to show why investments should be redirected or eliminated, illustrate the tradeoffs. (Check out our complimentary download for tips on The Language of Data Visualization.)

Effective and worthwhile analytics is analytics that includes visuals. Visuals provide narrative and insight. Brent Dykes, Director of Data Strategy at Domo, writes, "Ample context and commentary is often needed to fully appreciate an insight. When visuals are applied to data, they can enlighten the audience to insights that they wouldn't see without charts or graphs." In addition to telling an effective story, visuals enable us to process ever larger pools of data.
We can process visual information much faster than we can read. Scientists at the University of Pennsylvania School of Medicine discovered that the retina communicates with the brain at approximately 10 million bits per second. We cannot read anywhere close to this fast: at roughly 1,000 bits per second, communicating the same amount of information through text takes literally 10,000 times longer. Data is one of an organization's most valuable assets, and visuals empower us to unleash the value of that asset, analyze more effectively, and communicate even more efficiently and persuasively.

Most of us will not have access in 2017 to virtual reality data-immersion technology that surrounds us with our data, but we can get more hands-on with the data we use by building effective charts and graphs to analyze it and paint the picture for us (and for our audience of decision makers). In 2017, let the following be our guiding wisdom:

"Above all else show the data." – Edward Tufte, renowned statistician and Professor Emeritus at Yale University

For more on our Analytics skills training opportunities, check out our Analytics course offerings to see our range of courses—find the training you or your team needs and see when our next classes run (or contact us about private group training). Enroll...


Are You a Good Data Storyteller?

Posted by on Nov 14, 2016 in Analytics | 0 comments


Data talks. It conjures images. It tells stories: of success and ambition, as well as failure and challenge. It looks to the past and to the future, while delivering a clear and accurate now. But data professionals know it speaks best when arranged with a simple visual alongside a cogent message. Visuals show the story of your data, but are your visuals telling the right story the right way?

When data storytelling goes wrong—when the visual is too complicated, when there are too many meanings in the graphic, when the message can't be read quickly or clearly—your audience will not hear the story. And just as a great visual presentation can be remembered for a long time, some stakeholders may take a long time to forget a bad one. Get data storytelling right with our top five tips for making compelling visuals:

1. Imagine the visual first. Think about what your ideal chart for your upcoming presentation would look like: what stands out? What color, shape, or line length best highlights your point? Professionals often make the mistake of forcing their information into a pre-set visual template that never truly fits the story they're trying to tell. Sketch your visual on paper first, then create the digital version—this helps avoid the distraction of the default settings and options your software will interject.

2. Focus on the audience. Your visual is a vehicle to transform troves of data into a concise picture. This picture should help you get your point across, so consider what your audience will be interested in, what will hold their attention, how quickly they'll process the meaning, and what might distract them (avoid the latter). Whether they want breadth or depth, construct a visual that will readily highlight what you need them to know.

3. Use the right type of chart. Charts are not one-size-fits-all: line charts are effective for trends over time, but misleading for categorical comparisons.
See the following example, which can misrepresent one department's expenses as a decrease from another's (e.g., Accounting as a decrease from Acquisition), when the two are unrelated. Stick to contiguous data points for line charts, which highlight trends over time. Use a bar chart to make categorical comparisons, as in the example below.

4. Stick to one message. The power of visuals is that we process them faster—60,000 times faster—than text. Do not risk having your message get lost by trying to say too many things in one picture. Keep it simple; your audience will appreciate it (and they'll remember it).

5. Eliminate unnecessary details. Ask yourself: is every part of my visual, including the style and colors used, contributing meaningfully to the story I'm telling? Using lots of color, gridlines, or "flare" can be distracting and may actually take away from the impact your data can have. Try a minimalist approach—you may be surprised how simple and clear your data's story becomes. And a little minimalism can go a long way toward keeping your data details lean and necessary.

Storytelling with great data visuals ensures the audience grasps the main point. Software will help make the visual precise, but the best visuals are thoroughly planned before you even turn on the computer. So, grab a pencil and paper, start sketching your next great chart, and tell just one story at a...
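The bar-chart-for-categories advice above can be sketched quickly in code. This is a minimal example, assuming matplotlib is available; the department names and expense figures are made up for illustration, and the bars are sorted largest-first so the comparison reads instantly.

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

# Hypothetical departmental expenses: categorical values, unrelated to
# each other, so a line chart would falsely imply a trend between them
departments = ["Accounting", "Acquisition", "HR", "IT"]
expenses = [120_000, 95_000, 60_000, 140_000]

# Sort largest-first so the audience can rank departments at a glance
pairs = sorted(zip(departments, expenses), key=lambda p: p[1], reverse=True)
labels, values = zip(*pairs)

fig, ax = plt.subplots()
ax.bar(labels, values)  # bars invite comparison; no legend to squint at
ax.set_ylabel("Quarterly expenses ($)")
ax.set_title("Expenses by department")
fig.savefig("expenses_by_department.png")
```

The deliberate choice here is `ax.bar` over `ax.plot`: connecting categorical points with a line is exactly the misreading the example in tip 3 warns about.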


Analysis Paralysis: Find Time to Get the Croissant with Your Cup of Coffee

Posted by on Oct 11, 2016 in Analytics, Financial Management | 0 comments


You’ve just submitted the report that your supervisor wanted on projected HR costs for next quarter. As you lean back to enjoy your second cup of coffee and begin to tackle your next project, an email asking you to run the analysis with a different set of assumptions pops up in your inbox. So much for getting a head start on that other project. We’ve all been in this situation at least once. I’ll be the first to admit that in my early years as an analyst, I found myself copying and pasting, creating new worksheets for each request. Soon my workbooks looked like the mythical Greek hydra: a many-headed beast that couldn’t be slain. Changing one or two input items to yield different results that management can act upon is referred to as “modeling.” Fortunately, Microsoft Excel’s Scenario Manager provides functionality that allows an analyst to generate these different scenarios with a minimal amount of tedious copying, pasting, and formatting. It lets the analyst establish a menu of scenarios that will be used to produce the target value (HR costs in my example) and create a production-ready summary table. Here are some tips for using Scenario Manager to help free up the time to enjoy that second cup of coffee:

- Think through the computations and how you structure them within the worksheet. For example, if I am modeling HR costs based on the number of FTEs (employees) and benefit costs (fixed and variable), I put those input items in separate cells. Many analysts limit themselves to one input item, but Scenario Manager allows multiple items to be varied.

- Use named ranges for the input items that will be changing. Scenario Manager will use these names in the output. If you don’t name the ranges, only the cell addresses will be used – e.g., “$B$1” – which will only lead to questions and confusion.

- My personal preference is to use three scenarios: one representing the status quo, plus an aggressive and a conservative case. Alternatively, you could use the terms Low, Medium, and High. Exhibiting more than three will likely generate more questions…which means more work for you!

- Even if a variable doesn’t need to change, I tend to include it in my output report with its value held constant. I do this for a couple of reasons. First, it reminds me of the variables I used to construct the model. Second, it demonstrates to the audience that you are proactive and considering other items that could impact your analysis – even though you are keeping the value constant for the current analysis.

Let me share a bit more detail, using the example of HR costs. Below is the section of the worksheet where I put my computations. Notice that I have used named ranges and that they appear in the formula in cell B6. Based upon this structure, my scenarios can vary any of the four input items: FTE, Fixed Benefit $, Salary, and Variable Benefit. I am going to run scenarios that change the number of FTEs and the Fixed Benefit and keep the other two constant. These scenarios will be (Low) 25 FTE, $250 Fixed Benefit; (Medium) 50 FTE, $100 Fixed Benefit; and (High) 100 FTE, $75 Fixed Benefit....
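The same Low/Medium/High exercise can also be sketched outside of Excel. Below is a minimal Python version; since the worksheet’s actual formula in cell B6 isn’t reproduced here, the cost formula (FTE times the sum of salary and per-FTE fixed and variable benefits) and the constant Salary and Variable Benefit values are illustrative assumptions:

```python
def hr_cost(fte, fixed_benefit, salary=50_000, variable_benefit=5_000):
    """Illustrative HR cost model: each FTE costs salary plus a fixed
    and a variable benefit amount (all dollar values are hypothetical)."""
    return fte * (salary + fixed_benefit + variable_benefit)

# The three scenarios from the example: only FTE and Fixed Benefit vary;
# Salary and Variable Benefit stay constant (but remain visible inputs).
scenarios = {
    "Low":    {"fte": 25,  "fixed_benefit": 250},
    "Medium": {"fte": 50,  "fixed_benefit": 100},
    "High":   {"fte": 100, "fixed_benefit": 75},
}

summary = {name: hr_cost(**inputs) for name, inputs in scenarios.items()}
for name, cost in summary.items():
    print(f"{name:>6}: ${cost:,.0f}")
```

Like Scenario Manager’s summary table, changing one input dictionary regenerates every scenario with no copying and pasting.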

read more

Open Data: Your Future, Your Now

Posted by on Sep 30, 2016 in Analytics, Grants & Assistance | 0 comments

Open Data: Your Future, Your Now

I spent a lot of time this summer focused on our curriculum for pass-through entities (PTEs) and recipients alongside implementation of the DATA Act. This past Wednesday, I took the opportunity to reflect on these topics while I attended the Data Transparency 2016 + White House Open Data Innovation Summit. I approached my day from the perspective of what I could share with non-Federal entities (NFEs). Here are my key takeaways: Open Data in Ecosystems. The question of the day seemed to be, “We have the data, now how do we use it?” From the White House Chief Technology Officer to the most junior participant, there was a strong understanding that open data has to be made and used in ecosystems. We know that ecosystems are made of people, and it’s a good reminder that data by itself is worthless unless it’s used. That means you need to seek ways to share your data and find users. It also shows how important leadership, communication, and collaboration skills are in this space. NFEs can (and should) participate in DATA Act implementation efforts. In case you didn’t know, USAspending.gov is in beta testing now. I was able to add my small suggestion of what I want changed, and then I had a great conversation with some members of the USAspending.gov team. I found out that I could help with some of the beta testing…and so can you. Think about what you’re not getting out of the data now, log onto the beta site, and be an active stakeholder. You also still have time to participate in the Section 5 Pilots. The HHS DATA Act Program Management Office (DAP) continues to seek feedback on the proposed improvements to the Federal Financial Report, Single Audit, Common Data Element Repository (CDER) Library, Notice of Award, and Learn Grants. So if you have any issues with definitions or forms, speak up. So many resources are available to use now.
One thing that I have found hard is figuring out what to do about the DATA Act now, besides keeping up-to-date and participating in pilots, if you’re an NFE. Part of the impact of the DATA Act and the open data movement in government is already available in the form of publicly available information, and NFEs need to learn more about these data sets. Many of us know that the Census Bureau has many data sets available for you to use in your grant applications and project approaches. Or, if you’re a pass-through entity, there’s information to help inform your subrecipient risk assessments. Here’s some of what I found just by walking around the information tables:

- Data.gov – 200K+ searchable data sets (start here)
- Global Innovation Exchange – Brings together information on funders, innovators, and resources in the development area, and not just US sources
- EPA Environmental Dataset Gateway – Information on climate change resources, maps, and other information supportive of research and environmental justice
- Transportation Secure Data Center – A wide range of transit, energy, and homeland security information in one place

With so much data available and crisp fall air moving in, pour yourself a nice cup of tea and explore visualization tools and resources to increase your knowledge of the varied uses of...

read more

EEOC Executive Leadership Conference Wrap-Up

Posted by on Sep 29, 2016 in Analytics | 0 comments

EEOC Executive Leadership Conference Wrap-Up

Last week Management Concepts facilitated a workshop on data visualization at the EEOC Executive Leadership Conference. It was great to see so much interest in the topic, since organizations are faced with more data than we sometimes know what to do with. Because we process graphics much faster than text (60,000 times faster), creating visuals to convey the information in the data is critical to making the most of the data you have. I wanted to share some key takeaways based on the themes that resonated most in the workshop discussions. Make one key point per visual. Good data visuals make communication clearer and more efficient. Limit yourself to one key point per chart—this makes your point obvious and avoids the risk of your message getting lost in the visual, which is what you’re trying to avoid by summarizing the data visually in the first place. Use the right chart type for the point you want to make. Different chart types effectively convey different messages. For example, if you want to compare two data sets, use a bar chart, or use a bullet chart to show progress toward a goal or budget. In the EEO space, this could include progress toward placement of at-risk employee demographics, or progress toward grievance cases resolved. Use relevant metrics and be transparent. Metrics such as pay by gender and total charges are meaningful, but they’re always only part of the picture. Qualify what’s behind the metric and the context for the data. For example, EEOC publishes litigation and charge counts. Observing that charge and litigation counts in a particular category have decreased or increased is only part of the picture. Workforce growth or culture factors can shape whether problems are more or less likely to occur, so be sure to explain the narrative around the data. Be tuned in to the audience’s expectations and manage appropriately.
The objective of using visuals is to communicate data effectively, so if your audience is expecting to see the data in a specific format or chart type, don’t stray too far from that even if you’re trying to improve the accuracy or effectiveness of the chart. The best chart is the one your audience understands. If it’s a chart used in a regularly released report, for example, consider modifying the existing chart rather than wholesale replacing it with a new one that may fall short of expectations. Use color with restraint. Your agency may have a color palette typically used in formal presentations or reports, but this doesn’t mean you have to use it in your charts. Most of the time, the chart elements and axes sufficiently communicate the quantitative information, and black or gray ink is sufficient. Colors can convey qualitative distinctions but aren’t precise for conveying quantitative information. They also can have unintended meaning—red, for example, is often interpreted as negative; yellow, as requiring...

read more

How Much Data Do I Need?

Posted by on Sep 9, 2016 in Analytics | 0 comments

How Much Data Do I Need?

As an analyst and evaluator, the most frequent question I am asked is, “How much data do I need for this analysis?” You need enough usable, representative data to answer your questions. Here’s how you get there.

Representativeness

It’s common to focus on a response rate for a survey, or to count rows of data, as a way to show we have enough data. When I teach Management Concepts’ five-day Analytics Boot Camp course, I conduct an activity using marbles to illustrate the importance of knowing how representative your data is of your whole target group. If I have all solid-colored marbles, but I know there are cats-eye marbles in the bag, I might have to keep selecting marbles to find one. If they’re all at the bottom of the bag, and I don’t ever see them, is that okay? Here are some things you should find out before you collect any data, whether from people or from databases:

- How many should I have in my target group?
- How many do I have in the sample I’ve selected?
- What are the characteristics of the target group? (Age, education, years of service… if you can imagine a category, think about what the possibilities are within each category and write them out.)
- Do those look the same, proportionately, between my sample and the target group?
- If someone were making a decision about me based on this information, would I be comfortable with it? What might cause me to be concerned?
- What constitutes “enough” data in this situation? (Hint: Ask your stakeholders before you start collection.)

These analytic questions apply whether you are conducting an employee satisfaction survey or pulling large volumes of data from a government database. In many cases, you can’t get everything you want, and having all the data may be impossible (think about Department of Labor datasets, for example). I often see people collect data that doesn’t look at all like they intended: they want a cats-eye marble and only have solid-colored marbles.
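The proportionality check described above can be automated. Here is a small Python sketch (the marble counts are hypothetical) that compares category proportions in a sample against the target group:

```python
def representativeness_gap(target, sample):
    """Return, per category, the gap in percentage points between the
    sample's proportion and the target group's proportion."""
    t_total = sum(target.values())
    s_total = sum(sample.values())
    return {cat: abs(sample.get(cat, 0) / s_total - count / t_total) * 100
            for cat, count in target.items()}

# Hypothetical counts: the whole bag (target) vs. the marbles we drew.
target = {"solid": 80, "cats-eye": 20}   # cats-eye marbles are 20% of the bag
sample = {"solid": 38, "cats-eye": 2}    # ...but only 5% of our sample

gaps = representativeness_gap(target, sample)
# A double-digit gap on any category signals an unrepresentative sample.
```

The same comparison works for any category you can count: age bands, grade levels, or years of service, not just marbles.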
It’s hard to answer anyone’s questions when you don’t have what you need. Just because you have access to some marbles doesn’t mean they are the marbles you need for your analysis.

Usable Data

Not all data is clean or good for analysis. People responding to surveys may skip questions, leave out information, or answer in a way that is unclear. We can’t guess what they meant, so that data has to be excluded from analysis. In the cleaning and screening process (where analysts look at all the details and decide what can actually be used), as much as 20% of the data can turn out to be unusable (depending on how it was collected), and that impacts representativeness. If you thought you collected enough data, you may now find that it isn’t enough, because some of it can’t be used: it’s incomplete, error-laden, or affected by other problems (such as a software error). You have to plan ahead for some of the data being unusable for reasons that are beyond your control.

Tips for Collecting the Right Amount of Data

Know what your data should look like when you get it (based on the categories that describe your source – like age, total number of employees, ethnicity,...

read more

Top 5 Questions from Our DATA Act Webinar

Posted by on Aug 22, 2016 in Analytics | 0 comments

Top 5 Questions from Our DATA Act Webinar

With so many questions asked at the end of our DATA Act webinar last week, we were unable to answer them all on the spot, but here’s a follow-up on the top five questions asked. (If you were unable to attend the webinar live, access a recording here.) To whom does the DATA Act apply? The DATA Act applies to all Federal agencies. By May 8, 2017, all Federal agencies will be required to report their spending information in compliance with the new data-centric reporting structure. No later than August 7, 2018, OMB will decide whether DATA Act standards apply to all grantee and contractor reporting—OMB will submit a report to Congress one year prior on the two pilots (one for grantees, one for contractors) to inform the decision of how far to extend the DATA Act requirements. Will FFATA go away? In a word—no. The DATA Act is actually an amendment to FFATA. If you’re within a Federal agency, some of your reporting processes and requirements will change. Exactly what will change varies by agency. What are the pilots, and will I hear about their outcomes? Section 5 of the DATA Act specifies that two pilots must take place. OMB is tasked with leading the pilot for contractors to identify recommendations and test processes. It is working with HHS to implement the grants pilot, which will test several reporting processes and use of the Common Data Element Repository Library. OMB must submit a report on the pilots to Congress by August 7, 2017. Is my agency already compliant with the DATA Act? Ask around within your agency to find out where you are in the process, since it’s different for everyone. Your CDO, OIG, CIO, or CTO offices are likely to be involved, so communications from those offices are a good place to start. The DATA Act Information Model Schema was only finalized in May 2016, so it’s unlikely that anyone is in full compliance yet.
In November of this year, the Office of Inspector General at every Federal agency is required to submit a report on implementation progress. Full compliance is not required until May 8, 2017. Will Management Concepts offer additional updates and resources? Absolutely! Stay tuned for future blog posts on data security, legislative updates, DATA Act implementation milestones, and more. To support your work with Federal financial data, browse our training in data analysis, Federal financial management, grants management, or contracting. For support with leading and managing implementation, look to our IT PM training or change management training and consulting. If you have specific questions, please don’t hesitate to reach out at...

read more

Picture This: Data Visualization

Posted by on Aug 12, 2016 in Analytics | 0 comments

Picture This: Data Visualization

Three Reasons You Need to “Visualize” Your Data In an earlier blog post, I wrote about how an evidence-based culture supports making data-driven decisions. As I was reviewing our data visualization course this week, however, I was reminded of why it’s so important to communicate clearly about the information your data surfaces. Communicating about data requires us to look beyond the raw data and find the story within it. Whether you use a chart, a slide deck, or a run-of-the-mill Pivot Table, clearly conveying your data insights can go a long way toward facilitating data-driven decisions and making them a habit. First, data keeps us honest. Without data, it’s easy to speculate and hypothesize about what’s really going on. Recent inspections within VHA provide a great example here. It was known that there were delays in scheduling, and the inspections revealed that in most cases the delays were due to a shortage of specialty provider care rather than problems within the scheduling process itself (which had in the past been speculated to be the root cause of the delays). Second, data often runs counter to our intuitions. This is clear when I look at my own to-do lists. If I have a mile-long to-do list, it’s easy for me to feel like I’m not being productive. In reality, focusing on what remains to be done diverts our focus from what’s already checked off the list. I don’t build a bar chart to compare the number of tasks I’ve completed to the number that remains, but I bet if I did, I might be surprised. The same is true for monitoring performance at the organization level—sometimes we need the reality check that data provides, to show us that reality is better (or worse) than we expected. Finally, data raises questions just as much as it provides answers. I can look at data in a variety of formats—a table, a list, or charts—and all will help tell the story behind the raw data. In doing so, it inevitably invites me to probe deeper.
Once I have the data behind one question, I usually have follow-up questions in droves. Putting some data on the table is often the first step in building an evidence-based culture that routinely makes data-driven decisions. In your work, consider how you can communicate clearly about data. A picture is worth a thousand words, and visuals help convey both qualitative and quantitative data. I often find that charts, color-coding (for good vs. bad vs. neutral), or tables help highlight the main points I want to make. Consider what media and format you use to convey your data, and where you’d like to reap the benefits of an organizational culture that prioritizes data-driven...

read more

People Analytics Is Trending Up

Posted by on Jul 28, 2016 in Analytics | 0 comments

People Analytics Is Trending Up

This summer, I have been watching a lot of baseball. The sport has always captured my attention because it is a highly strategic game. Baseball managers face hundreds of analytical decisions each game, including whom to start, when to substitute players, what pitches to throw, how to align the defense based on a hitter’s tendencies, and how aggressively to approach base running (just to name a few). Baseball franchises have increasingly turned to analytical models to make more informed decisions in constructing rosters and projecting likely outcomes on the field. In fact, most top professional baseball franchises now employ teams of analysts who use empirical data to look for competitive advantages in personnel and financial decisions. These types of decisions are at the heart of people analytics. People analytics is a rapidly growing branch of data analysis that is particularly useful in the field of human resource management. It involves a thorough analysis of organizational “big data” to discover how employees, processes, and teams are performing, and how they are likely to perform in the future. People analytics provides a data-driven way to make sense of complicated workforce issues like employee satisfaction, retention, individual and team productivity, recruiting, marketing, sales, and compliance. According to Deloitte’s Global Human Capital Trends 2016 publication, recent investments in people analytics are starting to pay off for businesses around the world. The number of businesses that report being ready to incorporate people analytics has increased by a third (24% to 32%), and the number of businesses ready to develop predictive models of performance has doubled (4% to 8%). According to Deloitte, these advancements in people analytics capabilities are attributable to the increased availability of integrated systems and the willingness to invest in building analytics teams.
Though significant gains have been made in this field, there is still room for improvement at most organizations. The report details that 62% of organizations surveyed rated themselves as “weak” in using big data for recruiting. Additionally, more than half of respondents reported using HR data inadequately in predicting workplace performance and improvement. However, the report predicts dramatic improvement in these categories in the upcoming years, as technology continues to improve to meet business needs. Just as baseball franchises search for a competitive edge in building a winning team, organizations across the globe are investing in human resources to make data-driven decisions about how to best manage their employees. People analytics allows organizations to hire, retain, and motivate the right people to achieve their business...

read more

Why Don’t We Make More Data-Driven Decisions?

Posted by on Jul 7, 2016 in Analytics | 0 comments

Why Don’t We Make More Data-Driven Decisions?

As of this writing, Data.gov hosts 184,055 datasets. We have petabytes upon petabytes of data, yet we struggle to find the best answers for common problems like employee engagement, program schedules, and budget forecasts. In a recent panel on data-driven government, public and private sector leaders alike agreed that to use our data effectively, organizations must apply good data governance strategy and processes. While this is true (more on that in a future blog post), whether you make data-driven decisions is also driven by culture—ironically, a factor which, unless you have data on it, is difficult to address. Three themes can help you identify where your team or organization stands when it comes to making data-driven decisions. Where do you fit in? Do you have an evidence-based culture? In Creating a Data-Driven Organization, Carl Anderson identifies an evidence-based culture as a hallmark of data-driven organizations. Evidence-based cultures prioritize having data to support major decisions, open sharing of the rationale for decisions, and democratic distribution of information. How mature are your analytics processes? In 2008, the SAS Institute parsed out eight levels of analytics maturity within an organization, which provide a good gauge of how practiced your organization is at making data-driven decisions. If you use data mostly to understand what is happening, to understand ad hoc issues, or to tabulate requirements, you’re likely on the low end of the spectrum of capabilities for enabling data-driven decision-making and building an evidence-based culture. To promote evidence-based thinking and analytics maturity, OMB provides organizational incentives to departments that integrate evidence into their reports. This extends the benefits of evidence-based thinking beyond their inherent value to incentivize behavior changes that can build new habits. Are you interacting with your data? Data is not information.
That bears repeating: data is not information. Even once you process the raw data (clean it, organize it, interpret it, and summarize it), leaving it in a data file is not sufficient for driving a data-driven culture. We are more likely to remember information and apply the data the more we interact with it—a concept known as disfluency. We need to work with data enough for it to challenge our assumptions and introduce new findings, and for us to extract the information it contains. Print it out, hang it up, view it in multiple formats and on more than one occasion, and create a lab or workspace where people can interact with the data sets. Data governance and effective data management establish a framework and infrastructure that support collaboration around data, but culture is important, too. To drive data-driven decision-making in your organization, your team, or even your individual role, consider where you stand on these cultural indicators. Develop your “culture” infrastructure for data analysis along with the technical infrastructure....

read more

Breaking Down Big Data

Posted by on Jun 3, 2016 in Analytics | 0 comments

Breaking Down Big Data

The term “Big Data” is thrown around a lot in the IT and data analytics communities. Managing the size and complexity of available data has become a primary challenge for systems administrators, IT professionals, and data analysts alike. For those of us used to interacting with only a few hundred records in a data set, the idea of “Big Data” can seem overwhelming. However, the exponentially growing availability of “Big Data” provides government and business professionals with unprecedented access to information that can lead to better decision-making. What is Big Data? “Big Data” refers to large, complex data sets that require processing beyond the normal capacity of a storage system. Advances in technology have allowed for greater collection of data across all elements of human life. There is an ever-increasing pool of resources available to measure societal patterns and trends, as well as the personal data—web histories, social media accounts, commercial transactions, etc.—we produce each day. This compilation of Big Data is usually measured in petabytes or exabytes of information, and it cannot be easily analyzed or processed by normal software programs. This leaves organizations with the task of analyzing a massive collection of data with varying degrees of structure. How is Big Data being used in the Federal government? In 2012, the Obama administration announced the “Big Data Research and Development Initiative,” aimed at improving how the Federal government collects, analyzes, and interprets Big Data. The larger goal of this initiative was to empower Federal agencies to more efficiently organize and communicate data trends in the fields of science, education, and national security. In addition to the six agencies provided funding for the president’s initiative, a number of government agencies have used “Big Data” to find innovative solutions to problems central to the agency mission. Why does Big Data matter?
Decisions are more effective when they are data-driven. Having comprehensive processes for managing “Big Data” allows an organization to convert raw, unorganized data into actionable information in real time. Organizations can use this information to:

- Perform risk analyses
- Protect data security
- Reduce the cost of data storage
- Develop predictive models for future events
- Improve operational efficiencies
- Quickly identify problems and take corrective action
- Collaborate with other organizations toward achieving common goals

The increasing prevalence of “Big Data” provides Federal agencies with challenges and opportunities. Effective data analytics processes allow us to turn seemingly unmanageable data sets into useful and useable...

read more

DATA Act – Federal Countdown to May 2017

Posted by on May 31, 2016 in Analytics, Workforce Management | 0 comments

DATA Act – Federal Countdown to May 2017

Last week my colleague Kim Coelho brought us up-to-date on the data standards recently updated by Treasury. This will hopefully allow movement on DATA Act implementation. Several agencies reported to the Government Accountability Office prior to April that their progress was hindered by the lack of a final DATA Act Information Model Schema (DAIMS). Any delays are concerning at this point, as the statutory deadline for most requirements is May 2017. The release of the DAIMS may enable those people in the weeds of DATA Act implementation to make definitive changes in the business systems and processes that many Federal employees interact with daily. For some, there will be definite differences, like changing the structure of an agency’s award numbering system to meet the governmentwide requirements. So what should you do now? Educate yourself. The GAO released this infographic in January 2016 to provide the 50,000-foot view of the DATA Act. It starts with the point of the law: “Citizens want to see how Federal money is spent.” Determine if you are involved. Many of us hear “data” and immediately assume that the people involved will mainly be from the department’s IT team. After all, they design and build the systems. While that may be the case, we recommend reviewing the Office of Management and Budget (OMB) Management Procedures Memorandum No. 2016-03. You may find that your office needs to adjust its award-making processes to meet the requirement that financial assistance awardee names match those entered on SAM.gov. Budget time, people, and resources accordingly. Because the DATA Act is not new, most agencies should have included implementation activities as part of their annual budget requests. Now that Treasury has published the DAIMS, and while Congress is still working on FY 2017 appropriations, agency officials can take time to reassess whether these requests have changed.
Still feel like you’re deep in the world of technical mumbo-jumbo? Then it’s time to take the long view of the goal of the DATA Act – transparency on how our taxpayer dollars are spent. We expect that as a result of these changes, Federal departments and agencies will need to meet increased expectations on what information they can and should report. That means an increased proportion of the workforce needs analytics skills so they can approach, analyze, and visualize their data effectively for the public. This summer, Management Concepts is hosting a webinar on the DATA Act and data transparency in the Federal government. Stay tuned to learn how the DATA Act will impact your role and the wider Federal...

read more

DATA Act Standards Released: What’s the Impact?

Posted by on May 17, 2016 in Analytics | 0 comments

DATA Act Standards Released: What’s the Impact?

At the end of April, the Treasury Department released the fourth and final iteration of the data management standards called for by the DATA Act. These standards lay an important foundation for the rest of the Federal government to implement the DATA Act, which the Office of Management and Budget and Treasury are charged with executing. The DATA Act Information Model Schema (DAIMS, v1.0) presents data exchange standards, refined over the past year based on feedback from agencies and public stakeholders. The DAIMS includes standards for the flow of spending information within the Federal government, the specific data elements offices are required to include in spending reports, and definitions for the types of spending data that reports will include. Technical diagrams and the XBRL schema file are also available. The DAIMS, when fully implemented, will enable consistent reporting across agencies into a central Federal spending database. It will make compiling the data more reliable and efficient, and it will enable better access to the same types of Federal spending data across agencies. This blueprint and schema file are important steps in supporting the DATA Act’s vision of greater transparency and access to consistent Federal spending data. Nevertheless, much work remains before the Act’s requirements are fully realized. This summer, Management Concepts is hosting a webinar on the DATA Act and data transparency in the Federal government. Stay tuned to learn about how the DATA Act will impact your role and the wider Federal...

read more

Myth or Reality? I Don’t Need Data Analytics for My Job

Posted by on Apr 12, 2016 in Analytics | 2 comments

Myth or Reality? I Don’t Need Data Analytics for My Job

Data is exploding, especially in the Federal sector. Yet the increase in data analytics has come with an increase in specialized data analyst roles. So, is it a myth or reality that you don’t need data analytics for your job? It’s a myth! We all need to be—and can be—our own data analysts. Here are some common misconceptions that drive this myth, followed by ways you can use data analytics in your work…even if your title isn’t “Data Analyst.” Data analysis is a niche skill set, only for elite statisticians. While it’s true that some forms of data analysis take years of training and practice, knowledge of technical systems, or fluency in programming languages and complex databases, common data analysis tools make it easy for us all to be our own data analyst for many day-to-day needs. What you can do: If you’re new to data analytics, turn over a new leaf. Tools such as Excel and Tableau are designed to be user-friendly. You don’t necessarily need to be technically savvy; you just need to be willing to learn a few new functions. Computers can perform data analysis for me. Software and computing power are useful tools for performing analysis, but humans still need to guide the analysis: to make decisions about what data to use, to interpret the data, and to present the data. Computers have not replaced us. What you can do: Learn what software you have access to—you might be surprised. Consider training on basic functions, and be amazed at what you can learn in even one or two days to take advantage of analysis tools in common software such as Excel. My job doesn’t require me to analyze data. Demand for data-driven decisions is on the rise. Data provides insights into team and organizational performance, project performance, and budget utilization. Due to the DATA Act, all agencies will soon be required to provide data analytics as budget justifications, creating a significant increase in demand for performance analytics.
What you can do: Become a trusted resource on your team, and catalyst data-driven decisions in your organization. Develop a basic understanding of analytics and how data drives decision-making in your organization, and begin using data to understand and drive performance. But I don’t work with data at all. You still need to be able to understand data that impacts your job. Your team’s budget, performance benchmarks, goals, and your personal performance appraisals all are shaped by data. What you can do: Be a smart consumer of data analysis. Cultivate a foundation in analytics so you know how to interpret data that’s presented to you. Learn how to ask questions of data, how to understand analysis assumptions, and which claims data can and cannot validly support. Even if your primary job function isn’t data analysis, you already work with data on a regular basis. Whether it’s tracking the number of program tasks complete, analyzing larger data sets to inform planning, or simply asking questions of data that’s presented to you, we all need data analytics. Beyond the workplace, we’re continually presented with data. The need for data analysis skills is not just a valuable professional competency—it’s necessary to be a critical consumer of information. So whether you’re new to analytics or experienced—if your job is in data...
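The kind of day-to-day analysis described above doesn’t require specialized tools. As a rough sketch (the office names and task counts below are made up for illustration), the same group-and-summarize step an Excel pivot table performs looks like this in a few lines of Python:

```python
from collections import defaultdict

# Hypothetical rows a program office might export from a tracking
# spreadsheet: (office, tasks_completed) per reporting period.
rows = [
    ("Operations", 12), ("Operations", 15),
    ("Outreach", 7), ("Outreach", 9), ("Outreach", 8),
]

# Group by office, then summarize -- the pivot-table step.
by_office = defaultdict(list)
for office, completed in rows:
    by_office[office].append(completed)

summary = {office: {"periods": len(vals),
                    "total": sum(vals),
                    "average": sum(vals) / len(vals)}
           for office, vals in by_office.items()}

for office, stats in sorted(summary.items()):
    print(f"{office}: {stats['total']} tasks over "
          f"{stats['periods']} periods (avg {stats['average']:.1f})")
```

Nothing here is beyond what a one- or two-day introduction covers; the point is that "being your own data analyst" is mostly grouping, counting, and averaging with tools you already have.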


Understanding Data for Better Analysis

Posted on Mar 21, 2016 in Analytics | 0 comments

Understanding Data for Better Analysis

What Exactly Am I Looking at? Data analytics has exploded: We have more data at our fingertips than ever before, and we can do more with it than ever before. Demonstrating data leadership in 2016 requires that we better understand the data we have at our disposal, and that we are aware of the institutional priorities that drive the questions we ask of our data. We must recognize that learning more about data—and opening up our data to collaborative engagement—can have a tremendous impact on the quality of our analysis. Translating massive amounts of information into understandable analysis is no small feat—especially in the Federal government, which deals with data from millions of sources—but we can start by asking ourselves a few simple questions to better comprehend the data we have on hand:

Am I pulling the right data? Meaningful data analysis hinges on examining the right data set, which in turn largely depends on understanding why you are pulling data in the first place. Think about the issues or questions that your team is trying to address, and check in with your manager about how your department intends to use your analysis. Being cognizant of these underlying priorities can help you determine whether the data you are pulling is mission relevant.

Is this data sufficiently valid? Because the government deals with terabytes of data on a daily basis, it is extremely difficult—if not impossible—to get a set that is complete, accurate, and free of error. Do not aim for a perfect data set; instead, determine whether your set is sufficiently valid. Ask yourself whether it speaks to the questions at hand and will help you complete your work in a timely way.

What exactly am I looking at? Take some time to understand the variables in your data set and the types of data in play. This is important because you will have to communicate your own assumptions about the potential trends, gaps, and lapses in your data set. It is also valuable to know whether your data is up to date (or needs to be refreshed), has a set expiration date, or is linked to shared content, particularly if you get data requests from team members, other departments, or the general public.

Who can help me? There are many ways to better understand, communicate, and strengthen the work that you do: Get a colleague’s opinion about the implications of a particular trend in your charts; have a conversation with your supervisor about ways to better align your analysis with the team’s priorities; or enhance your analytical capacity.

As we move further into 2016, it’s safe to expect data leadership not just at the institutional level, but from individual employees who receive and manage large volumes of data every day. However, before we can get to managing data—to marshaling it for better decisions and improved service—we must first comprehend it. Yes, data analytics has exploded—and if we don’t understand the data that we have, we’re sure to be left in the...
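Questions about validity and freshness can even be partially automated before analysis begins. Here is a minimal, hypothetical sketch (the field names, records, and 90-day threshold are all invented for illustration) of screening a data set for missing values, out-of-range amounts, and stale records:

```python
from datetime import date

# Hypothetical records pulled for analysis; field names are illustrative.
records = [
    {"id": 1, "amount": 1200.0, "refreshed": date(2016, 3, 1)},
    {"id": 2, "amount": None,   "refreshed": date(2016, 3, 1)},
    {"id": 3, "amount": -50.0,  "refreshed": date(2015, 6, 30)},
]

def screen(records, as_of, max_age_days=90):
    """Flag records that are incomplete, out of range, or stale."""
    issues = []
    for r in records:
        if r["amount"] is None:
            issues.append((r["id"], "missing amount"))
        elif r["amount"] < 0:
            issues.append((r["id"], "negative amount"))
        if (as_of - r["refreshed"]).days > max_age_days:
            issues.append((r["id"], "stale data"))
    return issues

problems = screen(records, as_of=date(2016, 3, 21))
for record_id, issue in problems:
    print(f"record {record_id}: {issue}")
```

A checklist like this won’t make a data set perfect, but it turns "is this sufficiently valid?" from a gut feeling into something you can document and repeat.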


Visualization and Metrics: Using Data to Build High-Performing Organizations

Posted on Mar 4, 2016 in Analytics | 0 comments

Visualization and Metrics: Using Data to Build High-Performing Organizations

I recently attended the Government Analytics Breakfast forum hosted by Johns Hopkins University, where HHS’s Chief Data Officer Dr. Caryl Brzymialkiewicz discussed the impact of data science in the Federal government. The conversation covered a wide range of topics, from systems structure and performance improvement to data mapping, yet two things in particular stood out: data visualization and useful metrics. To use data effectively to understand program performance and allocate budgets, data visualization and metrics are important areas for every agency to invest in.

A data visualization is good only if it helps the end user understand and appreciate the story it tells. It’s not enough for the data to be accurate and the analysis relevant; the results must also be easy to interpret. This is what enables descriptive as well as prescriptive analytics to have a real impact. To get there, it’s important to be flexible. Agencies need adaptive systems that let users generate data visualizations in ways that maintain the integrity of the data but make it “pop” for the end user. Maps are a popular data visualization format, but different formats resonate with different users; there is no “one size fits all.” Regardless of what system or software your organization uses to visualize data, strive to make data analysis results available in a variety of visual formats.

In addition to good visualizations, ensure that your analysis tells a meaningful story by using the right metrics. Legacy metrics may have been shaped by what we were able to collect, so it’s important to make sure your data models and metrics reflect current agency priorities and needs. Consider the following:

Make sure your metrics are relevant. Do they reflect agency priorities? Are they based on meaningful things? Are they realistic, reflecting what your data does or can assess?

Assess your metrics’ lifespan and be realistic. For how long do you want to use this metric? When will you need to retire it and replace it with something else? What are the drivers that affect when you’ll need to do so?

Consider your stakeholders. What does Congress require for program justification? What are your customers’ goals? Invite your stakeholders to be part of the process of deciding which metrics you use and what benchmarks shape your analysis.

The amount of data available to us increases daily, and it falls to all of us—industry, government, and academia—to partner in continuing to improve data quality and empower high-performing...


Data Analytics in the Workplace

Posted on Dec 21, 2015 in Analytics | 0 comments

Data Analytics in the Workplace

Did you know that the United States Postal Service (USPS) photographs and documents every single piece of mail it processes? In 2012, that amounted to 160 billion pieces. This huge data set serves to facilitate collaboration across law enforcement agencies, and it’s just one example of how collecting, analyzing, and reacting to data can lead organizations to make strategic decisions. In our world of connected devices, email, instant messaging, and wearable technologies, collecting data is becoming the easy part. The hard part is analyzing large, complicated data sets in order to improve organizations and employees alike, especially when many tools are designed for use by “data scientists” rather than the everyday user. Nonetheless, even without advanced software, organizations and employees can still use data analysis strategies and techniques in their everyday work.

The following suggestions, based on a report from the IBM Center for The Business of Government and the Partnership for Public Service, are ways leadership and management can promote data analytics in Federal agencies:

Ensure and encourage access to data. Employee access to performance data promotes ownership of professional development, increases transparency, and reminds employees of the agency’s mission. Over time, this will establish data usage as a workplace norm at your agency.

Collaborate with other agencies. Look to acquire preexisting interagency data sets, tools, and services that can enhance your agency’s performance. If your agency has found success with certain data sets, tools, or services, consider establishing an agency partnership through a memorandum of understanding so each agency can improve its work.

Track the benefits of data analytics. As competition for funding continues, tracking the specific outcomes of data analysis initiatives and programs is vital to proving their worth for your agency. Top leadership needs cost-benefit metrics.

Present clear and concise analysis. Most data sets are massive, overwhelming, and complicated. Give high-level analysis to stakeholders who are unfamiliar with the data so it’s easier to absorb and understand, and tie the data and analysis to your agency’s mission.

At the individual level, you can use the following analytic techniques in your everyday work to improve performance, regardless of whether or not you’re working with a data set:

Acknowledge your assumptions. Reflect on your assumptions before starting work with a data set or information so you can manage their influence on your judgments. Rethink them rather than abandoning them, and you’ll see the bigger picture.

Question your information. Challenge your data set or information to ensure you come to a valid, reliable conclusion.

Respect all possible conclusions. Consider all competing conclusions so you’re not simply picking the first one that seems satisfactory. All perspectives should be given equal weight in order to find the best conclusion.

Be the devil’s advocate. Question the strongly held assumptions and conclusions of those around you who are working with the same data or information. Look for gaps, assumptions, or moments of groupthink that are being overlooked. Ask, “What if?”

Data analysis will continue to be a major influence in our lives in ways both seen and unseen. In the workplace, employees at all levels can champion the role of data and use analytic techniques to improve organizational and personal performance. The best solutions and conclusions are made when people have access to data and are empowered to connect the dots....


Pentagon’s Personnel Focus: Analytics

Posted on Nov 30, 2015 in Analytics | 1 comment

Pentagon’s Personnel Focus: Analytics

There is growing interest in people analytics in both the public and private sectors. I applaud Defense Secretary Carter’s decision to create an Office of People Analytics as one of the first big steps toward taking a fresh look at the department’s current and future people situation. Not only is it a perfect example of recognizing a missing link in the department’s Force of the Future initiative, it is a great example of how Ash Carter and his team are thinking differently about the situation at hand. In June I wrote about the Pentagon’s move to reform the military’s personnel system and highlighted a few grounding principles and practices for developing a modernized military talent system. Specifically, I called out data analytics as a critical practice for making successful personnel decisions.

Pentagon leaders, like many organizational leaders, have relied on intuition to guide their people and other decisions for far too long. We live in different times, with all sorts of structured and unstructured data flooding us daily, often without a clear vision for how to analyze and manage it in an integrated way, even as we sense that some important data is still missing. What happens next will be the difference maker for the Force of the Future initiative. Like building a new house, a new office creates a frame and container from which other things can happen. Some of the things that Secretary Carter and others with newly created people analytics structures should be thinking about and acting on are:

Identifying benchmarks. What are the important benchmarks that will surface the true state of affairs for the current and future workforce across the full employee lifecycle?

Growing analytics talent. What is needed to train and develop people to perform basic analytics, solve people-readiness issues, and rethink the approach to managing talent?

Managing the change process. How are change management practices being incorporated to create buy-in, communicate frequently, and identify advocates?

Integrating new and old technologies. How can you combine or compare data in new and different ways to get fresh insights?

Learning from others. Who can you identify as learning partners as you take new steps to advance the work of the People Analytics office?

Keeping context in mind. How can the important work done to date avoid being lost or derailed by a major change such as a presidential transition?

If you are looking for support in transforming your people and other data (old and new) into actionable insights, Management Concepts’ people and performance consultants can help you build new or different analytics capability and capacity in your organization....


Three Ways DATA Act Standards Will Help Government Act Using Data

Posted on Sep 18, 2015 in Analytics | 0 comments

Three Ways DATA Act Standards Will Help Government Act Using Data

The standards are set for government data. The end of this summer marks a significant milestone in the push for the Federal government to make more data-driven decisions. On August 31st, the Office of Management and Budget (OMB) and the Department of the Treasury announced that they have officially finalized the 57 standard data definitions written in support of the Digital Accountability and Transparency Act of 2014 (known as the DATA Act). This follows the initial standards rollout last May. These standards aim to support the government’s initiatives to improve the quality, consistency, and transparency of data reported on USAspending.gov. Of course, as stated in the standards announcement, “the power of Federal spending data is only as strong as the utility of that data.” Here are three ways these data standards will support the government in making data-driven decisions:

Put the work agencies are already doing to good use. Current legislation already requires government agencies to collect and report spending information. However, until now, how that information was collected, stored, and formatted was not legislated. Today, much of this information is effectively stored in silos, with each department, agency, and program using separate reporting systems and databases for their various types of data, from budget information to contract spending.

Leverage data from one another. With all agencies reporting their data using the same format and system, the government will be able to speak a common language. What’s more, agencies will be able to access and understand each other’s data, and there will be consistency in how agencies look at data. This will position agencies to make the best use of all the data they now have at their fingertips. For example, an agency can compare its project plan with data from other agencies that have conducted similar projects to help inform budget estimates, schedules, anticipated risks, expected results, and more.

Evaluate what happened and plan to improve. Standardizing how government reports spending data will provide greater clarity when reviewing the data. It will also enable agencies to use new and better tools for analyzing the information in order to understand the complete picture of what they are doing and how it aligns with policy goals. With that in mind, agencies will find they can improve the accuracy of their predictions and sync their planning with department and administration goals.

And these are just a handful of the possibilities! It’s exciting to think where the government could be—not to mention interested parties like taxpayers, grant recipients, and businesses that rely on this information—two years from now, when all government agencies are expected to have adopted the standards for their financial reporting. What do you think the future holds for government decision making given these new DATA Act standards? Share your thoughts in the comments and be sure to subscribe to Perspectives for more analytics blog...
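To see why standardized definitions matter mechanically, consider a toy conformance check. The field names and types below are illustrative stand-ins, not the actual 57 DATA Act definitions, but they show how a reporting pipeline could verify that every record uses the agreed-upon elements before data is shared across agencies:

```python
# Hypothetical subset of standardized field names and expected types;
# these are illustrative, not the official DATA Act element list.
STANDARD_FIELDS = {
    "awarding_agency_name": str,
    "federal_action_obligation": float,
    "period_of_performance_start": str,
}

def conforms(record):
    """Return a list of standard fields the record is missing or mistypes."""
    errors = []
    for field, ftype in STANDARD_FIELDS.items():
        if field not in record:
            errors.append(f"missing: {field}")
        elif not isinstance(record[field], ftype):
            errors.append(f"wrong type: {field}")
    return errors

record = {"awarding_agency_name": "GSA",
          "federal_action_obligation": "1000"}  # a string, not a number
print(conforms(record))
```

When every agency runs the same check against the same definitions, "speaking a common language" stops being a metaphor: any two conforming data sets can be joined and compared directly.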


Analytics.usa.gov and Federal Human Capital

Posted on Mar 23, 2015 in Analytics, Human Resources | 0 comments

As part of Sunshine Week, the General Services Administration’s 18F Team unveiled a new real-time analytics dashboard at https://analytics.usa.gov/ that, according to the site, “provides a window into how people are interacting with the government online.” Already, experts are gathering insights from the data that will both inform improvements to Federal customer service and shape the next generation of open data initiatives and dashboards to support data-driven decisions across the Federal government.

By now, you might be wondering why someone with an interest in human capital, leadership development, and organization development is blogging about a web traffic analytics platform. When I first stumbled on the site, I found it interesting in an “I’m a data nerd” sort of way, but it wasn’t until I read the blog on how the site was built that I connected the dots. The dashboard is completely open source, uses publicly available data (which can be downloaded directly from the page) on commercial platforms (Google Analytics, Amazon S3, and Amazon CloudFront), and was built in 2-3 weeks. In an environment that is, at times, criticized for inefficiency, unnecessary bureaucracy, and being behind the times, the 18F Team and their partners in the Digital Analytics Program demonstrate how unique approaches to organizing and managing work can foster innovation, even in large Federal agencies like GSA. Here are a few characteristics of the 18F Team that could be adopted in other agencies to help drive innovation:

Leverage diverse skill sets and backgrounds. The 18F team is a group of “doers recruited from the most innovative corners of industry and the public sector.” Having multiple perspectives and backgrounds on the team contributes to creativity, helps solve difficult problems, and builds a sense of inclusion that fosters sharing.

Find people who are passionate about the mission. Along with multiple perspectives, the 18F Team is united around a shared passion for driving efficiency, transparency, and savings for government agencies. Passion builds commitment and energy that can increase engagement and discretionary effort and encourage productive risk taking.

Look for quick wins and build on them later. The team acknowledges that analytics.usa.gov is just a first step and that the lessons they learn and the feedback they get as the public interacts with the data will improve future dashboard products. Finding ways to rapidly introduce change, with a plan for gathering and exploiting lessons learned, can be a strong springboard for larger innovation and change efforts.

Build from areas of strength. The analytics platform uses existing technologies, initiatives, and capabilities within GSA to provide a new way of interacting with the public and improving agency decisions. Every agency has pockets of excellence and strengths that can serve as a platform for experimentation and evolutionary (rather than revolutionary) change. Building from strength increases the likelihood of success while celebrating what’s going well within the agency.

The success of the 18F Team and analytics.usa.gov (which is only one of the team’s projects) provides a great model for innovation in Federal agencies. What other lessons do you see in the rollout of this new...


Help Your Federal Team Hit More Home Runs

Posted on Oct 23, 2014 in Analytics, Human Resources, Leadership | 0 comments

According to a new report by the Government Accountability Office (GAO), performance information’s effect on Federal managers’ decision-making has remained largely unchanged for six years. Despite the increase in the amount, variety, and availability of performance data and analytics tools to drive decisions, performance data’s promise in the Federal Government has not yet been realized. There are many theories as to why this is the case, but I would argue that the shift to data-driven decisions in the Federal Government requires disrupting the mental models most commonly used in making decisions.

Despite the Nats’ absence from the World Series this year, Major League Baseball serves as a great example of how disrupting a mental model can enable a leader to improve decision-making and organizational performance. One recent revolution in the baseball world is the now-accepted practice of applying empiricism and analytics to performance data to drive strategy and tactics. Analytics helps baseball executives find hidden value in player effectiveness and situational game tactics that lead to the ultimate criterion of success: wins on the field. The rise of this new field, Sabermetrics, pioneered by Bill James, has been well documented in Michael Lewis’s 2003 book-turned-movie, Moneyball. Embracing this approach (gathering data and performing analysis to determine what skills and behaviors contribute to wins on the field) is one of the things that has helped teams from smaller markets bring greater competitive parity to the game.

For over a century, baseball talent evaluators relied on conventional wisdom to assess players against generally accepted criteria regarding the “5 Tools” of baseball: running speed, arm strength, hitting for average, hitting for power, and fielding. During a player’s early career, scouts would assess all players on these traditional success criteria, usually simply by watching the players perform. Little effort was made to systematically gather data in a way that permitted players to be compared. And even with these attempts to measure raw skills, some players who excelled on the 5-Tools metrics were unable to perform under game conditions and produce wins, an indication that while raw tools were important, they were inadequate as sole predictors of success. In addition, baseball’s conventional wisdom about game tactics, informally referred to as “the Book” (when to bunt, steal a base, position the defense for certain hitters, and even make player substitutions), relied on time-worn but not rigorously tested presumptions about what actually produced wins. By asking the same questions for over a century, baseball scouts and executives relied on consistent criteria, analyzed in the same way, to make player assessments.

Over the last 15 years, more focused measurement, the rise of behavioral economics, and an improved willingness and ability to perform statistical analysis have tested these assumptions and brought new wisdom to baseball. This approach is yielding greater insights into what skills, behaviors, and tactics lead to team wins. While 15 years ago very few teams were aware of or invested in this approach to discovering and benefiting from objective truths about baseball, now every team has staff dedicated to measurement and to video and statistical analysis to identify and leverage a winning edge. By shifting their viewpoint, asking different questions, measuring performance differently, and then analyzing the data they measured, scouts and executives...


I’ll take Cognitive Analytics for $1000, Alex

Posted on Jul 16, 2014 in Analytics, Human Resources, Leadership, Project Management | 0 comments

One of my fondest childhood memories is my family’s nightly ritual of gathering around the TV to watch Jeopardy! with Alex Trebek. I’m still a big fan of the show, and when IBM’s Watson took on two Jeopardy! champions in 2011, I was captivated. Having worked on some early efforts to use Natural Language Parsing (NLP) and Latent Semantic Analysis (LSA), it was great to see how the technologies had advanced to allow querying of large sets of unstructured data using plain language queries. Watson is just one impressive example of the growing field of cognitive computing and cognitive analytics. Cognitive analytics refers to the process of bringing together machine learning, natural language processing, and artificial intelligence to analyze large quantities of unstructured data in ways similar to those used by the human brain. According to Deloitte’s Tech Trends 2014, “cognitive analytics relies on technology systems to generate hypotheses, drawing from a wide variety of potentially relevant information and connections,” and the emerging technology will be a growth area for many organizations in 2015. In honor of my favorite game show, I thought I’d provide a Jeopardy-style list of ways Federal managers may, in the not-too-distant future, be able to use cognitive analytics to improve organizational performance.

Answer: Natural language search agents.
Question: What tools can government agencies use to improve customer service in a resource-constrained environment?
Apple’s Siri is arguably the most familiar version of an artificial intelligence-based natural language query engine, but corporations have been introducing more rudimentary versions of automated agents to support customer service for nearly a decade. With enhanced language understanding, the introduction of machine learning that can improve the recommendations coming from search agents, and greater access to information storage and processing power, opportunities are increasing to automate customer service functions. Cognitive analytic techniques will enable systems to interpret and connect disparate pieces of information to provide better answers and more resources in response to customer inquiries. Automating elements of customer service processes (for both internal and external customers) supports agencies’ drive to maintain service levels with decreased resources. The key will be implementing the type of technology users are becoming accustomed to (e.g., Siri) while still allowing easy access to customer service agents before users become frustrated with the technology experience.

Answer: Social media monitoring and sentiment analysis.
Question: How can a Federal manager use cognitive analytics to understand trends in employee engagement and brand management?
In an increasingly connected world, it is important for organizations to maintain awareness of how they are perceived on social media. Using analytic tools, agencies can monitor, aggregate, and analyze trends in messaging on internal and external social media networks to understand how employees and the public view the organization. And emerging technologies for sentiment analysis offer a glimpse into the positive or negative views being communicated about the organization.

Answer: Better data aggregation, improved use of unstructured data, and faster data processing.
Question: How can cognitive analytics enable data-driven decision making at my organization?
A recent survey by MarkLogic and GovLoop suggests that many government agencies are struggling to realize the benefits of big data and advanced analytics for their organizations. The merging of advanced technologies in the field of cognitive analytics will offer agencies a...
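Production sentiment analysis relies on trained models, but the core idea can be sketched with a toy word-polarity lexicon (the word lists and sample messages below are invented for illustration):

```python
# Toy lexicon-based sentiment scoring. Real tools use trained models,
# but the essence is scoring text against known word polarities.
POSITIVE = {"great", "helpful", "efficient", "proud"}
NEGATIVE = {"slow", "confusing", "frustrated", "broken"}

def sentiment(message):
    """Classify a message as positive, negative, or neutral."""
    words = [w.strip(".,!?") for w in message.lower().split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

posts = [
    "The new portal is great and efficient",
    "Frustrated by the slow, confusing rollout",
]
print([sentiment(p) for p in posts])
```

Aggregating scores like these over time, rather than reading individual posts, is what turns social media monitoring into a trend line a manager can act on.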


Sometimes Smaller is Better: Starting an HR Analytics Program

Posted on Jan 16, 2014 in Analytics, Human Resources, Leadership | 0 comments

Last week, I blogged about the emerging skillsets required in the HR function for introducing analytics and data-driven decision making to the HR practice. Even with the right team in place, it can be daunting to launch your first analytic study. Much has been said about the importance of data-driven decision making for HR, and the early results suggest that organizations that are adopting data analytics to support HR decisions are reaping the benefits. However, as the resources available to government agencies continue to be stretched thin, implementing analytics programs can seem like an impossible task. But there is good news if you are the CHCO of a small or mid-sized organization: not all analytics programs have to be complex and costly. Sometimes smaller is better when it comes to your first foray into the world of workforce analytics. Here are a few strategies your organization can use to start an HR analytics program and reap the benefits of data-driven decisions.

Go Small to Go Big. One of the most significant challenges in creating an analytics program is conceptualizing and implementing big data tools and methods. So, instead of trying to build and deploy a comprehensive program, find some small wins where a limited and readily available (or easily collected) data set can provide solid evidence that will improve your decision making. Applicant tracking systems, learning and performance management systems, and intranet sites, all present in most organizations, may provide valuable insight on trends, issues, and organizational needs that can be collected and evaluated with minimal investment.

Use Data to Confirm (or Disprove) Your Intuition or Hypotheses. While the ability to improve workforce decision making through data collection and analysis is indisputable, a hallmark of effective leadership is the ability to make sound decisions based on experience and intuition. In the rush to utilize big data and analytics, it can be easy to overlook your own successful record of making good decisions in the absence of data. So, don’t throw the baby out with the bath water. Instead, focus your data collection and analysis efforts on gathering data that can confirm or disprove a hypothesis you already have. Chances are, if you’re plugged in to your workforce, you really do know what’s going on in the organization and what it takes to be successful. Devise a data collection strategy that targets areas where your intuition tells you more data is needed, while moving areas where you have a high degree of confidence lower on the priority list. Of course, there are risks with this strategy: you might be ignoring blind spots or overestimating your grasp on the organization and its challenges. However, with limited time and money to invest in data-driven decision making, prioritizing your investments is a necessity, so trust the wisdom gained from experience to guide you in the right direction.

Start With the End in Mind. As more and more is written about the benefits of data-driven decision making, you will feel a strong temptation to go fishing for whatever your data can tell you. Resist the urge to invest time mining data in an undirected exploration. Instead, take time upfront to carefully consider the challenges your organization is facing and how increased data may...


Are You Ready for Data-Driven Decision Making in HR?

Posted on Jan 9, 2014 in Analytics, Human Resources, Leadership | 0 comments

Are You Ready for Data-Driven Decision Making in HR?

A few weeks back, Management Concepts released the white paper Federal HR Trends in FY14, our take on the five trends we believe will shape the Federal human resources and human capital space this year. In this blog, we’ll explore the third item on our list: Data Analysts in HR. It’s not news that the rise of big data is a leading story in the field of human resources, or that the push for HR departments to embrace data-driven decision making is a major focus across the industry. Much has been said about the importance of analytics for HR, and the early results suggest that organizations that are adopting data analytics to support HR decisions are reaping the benefits. As government agencies continue to feel the pressure to optimize their investments, the push for data-driven workforce decision making will continue to mount, while the resources available to implement analytics programs are likely to continue declining. The focus on analytics, growing (and perhaps unrealistic) expectations about the benefits of big data, and mounting pressure to make use of available data from HRIS are combining to create a high-pressure environment in which CHCOs are being pushed to grow their ability to apply quantitative analysis techniques to support HR decision making. Introducing analytics to the Human Resources Line of Business (HRLOB) will require HR personnel with a set of skills that has not traditionally been part of the human resources function. To ensure your organization can realize the benefits of data-driven decision making, here are a few key skillsets you’ll want to make sure are part of your workforce for 2014:

Business Acumen. The ability to tie HR data and study results to core organizational performance metrics will be critical for successful implementation of HR analytics. While it’s one thing to design a research study, gather data, and analyze the results, making those results compelling by linking them to key performance indicators that interest senior executives is a distinct activity that has not typically been part of HR’s area of expertise. Because introducing analytics to the HR organization will require investment (and, as such, tradeoff decisions), HR leaders will be required to demonstrate the return on those investments with solid links to business outcomes that matter to leaders across the organization.

Research / Hypothesis Design. Effectively using analytics to drive decision making requires a carefully formulated question and the design of a data collection and analysis strategy that will yield actionable information. HR practitioners need to understand how to design research studies to explain events within their organization. Knowledge of effective statistical sampling techniques and of which types of analyses will provide the right view of the data at hand will ensure that the right data is collected from the right subset of your workforce in order to obtain the information needed by HR and leaders in other functional areas.

Statistical Analysis. Along with knowing how to design a study so that the right data is collected, it is imperative that HR practitioners develop strong capabilities in statistical analysis using tools such as MS Excel, SPSS, or MATLAB. The ability to calculate and appropriately interpret key statistical metrics like measures of central tendency, as well as more advanced analyses such as correlations, t-tests,...
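As a rough illustration of those fundamentals (the cohort scores below are invented), here is how measures of central tendency and a two-sample comparison might look in plain Python, using Welch’s t statistic to ask whether one training cohort really outperformed another:

```python
import math
import statistics as st

# Hypothetical assessment scores for two training cohorts.
cohort_a = [72, 85, 78, 90, 81, 77]
cohort_b = [68, 74, 70, 79, 72, 71]

# Measures of central tendency and spread.
print("mean:", st.mean(cohort_a),
      "median:", st.median(cohort_a),
      "stdev:", round(st.stdev(cohort_a), 2))

def welch_t(x, y):
    """Welch's two-sample t statistic (unequal variances assumed)."""
    vx, vy = st.variance(x), st.variance(y)
    return (st.mean(x) - st.mean(y)) / math.sqrt(vx / len(x) + vy / len(y))

print("t statistic:", round(welch_t(cohort_a, cohort_b), 2))
```

A t statistic well above 2 suggests the difference between cohorts is unlikely to be chance alone, which is exactly the interpretation skill the post calls for: knowing not just how to compute a number, but what claim it does and does not support.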
