The scope of Wage and Hour cases extends beyond traditional claims involving overtime or off-the-clock work. The same analytical principles apply, for example, to cases involving employee reimbursements. EmployStats recently worked on a California case in which the Plaintiffs alleged they were not reimbursed for routine miles traveled in personal vehicles between job sites, despite the Defendant’s stated reimbursement policy.

The EmployStats team assessed the Plaintiffs’ theory of liability and estimated unreimbursed expenses based on the available case data on mileage, parking, and toll charges. The analysis presented to the court showed a significant difference between stated and actual reimbursements for miles traveled by the Plaintiffs. Based on this analysis and other evidence at trial, the court certified the Plaintiff class.

The EmployStats Wage and Hour Consulting team’s trial plan is as follows:

  1. First, the EmployStats team would survey a statistically representative sample of class members about the existence of unreimbursed miles, using a random sampling methodology to minimize potential bias.
  2. Next, the team would use a similar statistical sampling methodology to determine the typical miles traveled by class members, then combine the resulting data with mapping platforms (e.g., the Google Maps API) to calculate the distances in miles traveled between job locations.
  3. Finally, EmployStats would tabulate damages based on these results, using publicly available data on reimbursement rates for miles traveled in personal vehicles.
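The three steps above can be sketched in Python. Everything below is illustrative: the job-site coordinates, class size, sample size, and per-mile rate are placeholders, and the straight-line haversine distance stands in for the driving distances a mapping platform such as the Google Maps API would actually return.

```python
import math
import random

# Hypothetical job-site coordinates (latitude, longitude); in a real case
# these would come from the case record and a mapping platform.
JOB_SITES = {
    "site_a": (30.2672, -97.7431),
    "site_b": (30.5083, -97.6789),
}

def haversine_miles(a, b):
    """Great-circle distance in miles between two (lat, lon) points.
    A mapping API would return driving distance instead."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 3958.8 * 2 * math.asin(math.sqrt(h))  # Earth radius in miles

def estimate_damages(class_members, sample_size, rate_per_mile, seed=0):
    """Draw a simple random sample of class members, average their
    unreimbursed miles, and extrapolate damages to the full class."""
    rng = random.Random(seed)
    sample = rng.sample(class_members, sample_size)
    avg_miles = sum(m["unreimbursed_miles"] for m in sample) / sample_size
    return avg_miles * rate_per_mile * len(class_members)

# Illustrative only: 100 class members who each report one unreimbursed trip.
trip_miles = haversine_miles(JOB_SITES["site_a"], JOB_SITES["site_b"])
members = [{"unreimbursed_miles": trip_miles} for _ in range(100)]
damages = estimate_damages(members, sample_size=30, rate_per_mile=0.655)
```

Because the sample is drawn at random, the extrapolated total is an unbiased estimate of class-wide damages; the per-mile rate here is a placeholder for whatever publicly available reimbursement rate applies.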
A copy of the court’s order can be found through the link here: McLeod v Bank of America Court Order – Dwight Steward PhD Statistical Sampling Plan
To see how EmployStats can assist you with similar employment or statistics cases, please visit www.EmployStats.com or give us a call at 512-476-3711.  Follow our blog and find us on social media! @employstatsnews

The United States Census Bureau announced in September 2018 that its privacy policy regarding the 2020 Census and other public use data projects will be undergoing changes, some of which could have an impact on many areas of data science.

According to a December 2018 report written by the Institute for Social Research and Data Innovation (ISRDI) at the University of Minnesota, the US Census Bureau’s new set of standards and methods for disclosure, known as differential privacy, may make it impossible to access usable microdata and may severely limit access to other important public use data.
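As a rough illustration of the approach the report discusses, differential privacy typically works by adding calibrated random noise to published statistics. The sketch below shows the classic Laplace mechanism for a simple count query, assuming sensitivity 1 (adding or removing one person changes the count by at most 1); the population figure is purely hypothetical.

```python
import math
import random

def dp_count(true_count, epsilon, rng):
    """Release a count with Laplace(1/epsilon) noise, satisfying
    epsilon-differential privacy for a sensitivity-1 count query."""
    u = rng.random() - 0.5  # uniform on [-0.5, 0.5)
    # Inverse-CDF sample from a Laplace distribution with scale 1/epsilon.
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return true_count + noise

rng = random.Random(42)
true_count = 1200  # hypothetical small-area population count
noisy = dp_count(true_count, epsilon=0.5, rng=rng)
```

Smaller epsilon means more noise and stronger privacy protection; the ISRDI report’s concern is that noise large enough to protect individuals can make small-area tabulations and microdata statistically unusable.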

Data scientists, including those at EmployStats, have been regularly utilizing free, reliable public Census Bureau data to analyze social and economic factors across America for over six decades.  The US Census Bureau releases public microdata such as the Decennial Census and the American Community Survey, which collect information on demographics, employment, income, household characteristics, and other social and economic factors.  EmployStats uses this data regularly to assist clients in Labor and Employment cases.

The ISRDI report can be found here.

To find out more about how EmployStats can assist you with your Labor and Employment case, please visit www.EmployStats.com and make sure to follow us on Twitter @employstatsnews

All data projects can benefit from building a Data Management Plan (“DMP”) before the project begins.  Typically, a DMP is a formal document that describes your data and what your team will do with it during and after the data project.

There is no cookie-cutter DMP that is right for every project, but in most cases the following questions should be addressed in your DMP:

  1. What kind of data will your project analyze?  What file formats and software packages will you use?  What will your data output be?  How will you collect and process the data?
  2. How will you document and organize your data?  What metadata will you collect?  What standards and formats will you use?
  3. What are your plans for data access within your team?  What are the roles that the individuals in your team will play in the data analysis process?  How will you address any privacy or ethical issues, if applicable?
  4. What are your plans for long term archiving?  What file formats will you archive the data in?  Who will be responsible for the data after the project is complete?  Where will you save the files?
  5. What outside resources do you need for your project?  How much time will the project take your team to complete and audit?  How much will it cost?

When working on any type of data project, planning ahead is a crucial step.  Before starting a project, it’s important to think through as many of the details as possible so you can budget enough time and resources to accomplish all of the objectives.  In fact, some organizations and government entities require a Data Management Plan (“DMP”) to be in place for all of their projects.

 

A DMP is a formal document that describes the data and what your team will do with it during and after the data project.  Many organizations and agencies require one, and each entity has specific requirements for their DMPs.

 

DMPs can be as simple as a readme.txt file, or as detailed as plans tailored to specific disciplines using online templates such as those at DMPTool.org.  The DMPTool is designed to help create ready-to-use data management plans.
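A minimal readme-style DMP addressing the five questions listed above might look like the following; all project details here are placeholders, not a template mandated by any particular agency.

```text
DATA MANAGEMENT PLAN: Example Project
1. Data & tools: payroll and timekeeping CSVs; processed in Stata;
   output delivered as tabulated damages spreadsheets.
2. Documentation: variable codebook in codebook.txt; ISO 8601 dates;
   one observation per employee-week.
3. Access & roles: analysts have read/write access on the shared drive;
   personally identifiable fields restricted to the lead analyst.
4. Archiving: final data saved as CSV and Stata .dta; retained 7 years;
   project lead is custodian after completion.
5. Resources & budget: source data provided by counsel; estimated
   6 weeks to complete and audit.
```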

Doug Berg, Ph.D., is an expert in big data who has worked with EmployStats and Principal Economist Dr. Dwight Steward for several years on class action and discrimination lawsuits.  Dr. Berg is currently a professor in the Department of Economics at Sam Houston State University.  He received his Bachelor’s degree in Accounting from the University of Minnesota and his Ph.D. in Economics from Texas A&M University.  Dr. Berg will provide additional support and expert insight into using big data in employment litigation.  He describes litigation as “living on data”: the better the data, the better the argument.  EmployStats welcomes his insight into the underlying meaning behind the data our clients provide us!

Big data is not simply a matter of size; the term also describes the type of data tools an analysis will require.  Most, if not all, of the big data we work with at EmployStats requires specialized tools that are constantly changing and evolving, with new tools entering the market all the time.

Each tool handles big data differently and offers specific benefits that determine how an analysis is performed and how its results are interpreted.  EmployStats keeps up to date with the latest data analytics software for large data sets in order to optimize the outcome of these types of analyses.

Recent cases such as United States of America v. Abbott Laboratories and Pompliano v. Snapchat have utilized big data analysis techniques, showing that big data is not only common in litigation but often necessary to bring a case to a successful close.

This past week, EmployStats associate Matt Rigling visited Washington, D.C. for a training course led by StataCorp experts. The course, titled Using Stata Effectively: Data Management, Analysis, and Graphics Fundamentals, was taught by instructor Bill Rising at the MicroTek Training Solutions facility, just a few blocks from the White House.

Here at EmployStats, our analysts use the statistical software package Stata for data management and data analysis in all types of wage & hour, economic, and employment analyses. With Stata, every analysis can be reproduced and documented for publication and review.

The training course covered topics ranging from Stata’s syntax to data validation and generation, and even estimation and post-estimation. “I took away a lot of useful techniques from the Stata course, and I learned about some new features of Stata 14, such as tab auto-complete and the command to turn Stata Markup files into reproducible do-files. Most importantly, I learned data manipulation skills that will help me work more efficiently and accurately,” said associate Matt Rigling.