InfoAccess Migration to Badger Analytics

Our Updated Approach: Migrating Logical Groupings

Badger Analytics, UW-Madison’s data warehouse built on the Snowflake platform, officially went live in March 2021. Lessons learned in the months since that release have prompted a change in our approach to migrating all data into Badger Analytics so that InfoAccess may be retired. We will migrate data from InfoAccess into Badger Analytics in sprints organized around logical groupings, determined by the number of users, the desire or need to enhance the content, and the change impact. This will allow us to minimize the duplication of effort needed to maintain data in both InfoAccess and Badger Analytics, and to get users up and running with Badger Analytics sooner.


InfoAccess migration sprints with data groupings



As with any project, changes will need to be made along the way and delays may occur. This represents our best estimate of the duration, timing, and order of the sprints.

Badger Analytics migration project timeline as of May 2022

Local Implementation Managers

To facilitate the migration of current InfoAccess users into Badger Analytics, ODMAS has partnered with a local implementation manager (LIM) from each division. The LIMs are responsible for assessing the use cases for their division, evaluating users’ technical skillsets, identifying needed training and support, and serving as a single point of contact between their unit’s Badger Analytics users and ODMAS.

Division: Implementation Manager
APIR – General Education Admin: Clare Huhn
Business Services: Lea Erickson
College of Agriculture and Life Sciences: Ann Bourque
College of Engineering: Sara Hagen
Division of Continuing Studies: Alan Ng
Division of Student Life: Todd Schwanke
Division for Teaching and Learning: Jeff Shokler
Enrollment Management: TBD
Facilities Planning & Management: David Gerber
Libraries: Steve Meyer
General Services: Clare Huhn (DDEEA), Jennifer Klippel (MBO), Jon Vander Hill (OHR)
Information Technology: Mark Field
Intercollegiate Athletics: Jim Ekenberg
International Division: Csanád Siklós
Letters and Science: Anne Gunther
Law School: Eric Giefer
Nelson Institute for Environmental Studies: Tara Mohan
OVC for Research and Graduate Education: Ryan Pingel (RSP), Peter Kinsley (Graduate School)
School of Business: Nate Kelty
School of Education: Sara Alva Lizarraga
School of Human Ecology: Jonathan Henkel
School of Nursing: Nikki Lemmon
School of Pharmacy: Mike Pitterle
School of Veterinary Medicine: Clifford W. Bass
SMPH: Elizabeth Simcock
UHS: Courtney Blomme
University Housing: Dave Swiderski
Wisconsin Union: Jeff White

InfoAccess Integrations Migration

We are aware that, over time, a large number of campus-delivered and custom applications were built with InfoAccess as their data source. Because Badger Analytics/Snowflake serves only analysis and reporting use cases, downstream InfoAccess integrations will not be permitted in the Snowflake environment; instead, they will need to be migrated to the new campus Interop infrastructure for system-to-system integrations. We are working directly with the DoIT WaMS and AIMS teams to inventory campus systems with InfoAccess data integration dependencies and develop a migration plan, but we also want to hear from our local implementation managers about additional divisional applications not covered under the DoIT umbrella. Once those are identified, we will work with the divisional application developers to schedule the migration work as Interop delivers the common APIs.

Badger Analytics Onboarding

We are currently reviewing all InfoAccess user accounts. Working with a local implementation manager (LIM) from each unit or division, we will evaluate each account to ensure it meets the criteria for conversion to a Badger Analytics account. The LIM will help determine that the account is held by a person whose job responsibilities still require data access, that the usage complies with the Institutional Data Policy as well as our security model, that the user has the skills needed to use Snowflake, and that the use case is most appropriate for Badger Analytics (as opposed to another solution, such as a Tableau workbook).

Users will be onboarded during the earliest sprint in which all the content area(s) they use are migrated. The needed training, certification, and user support will be provided at that time.

At this time, only a small group of users has access to Badger Analytics, as only a few content areas (DARS, Profit & Loss Report, Lumen, and Academic Structure) are currently in Badger Analytics. We anticipate the first large group of users using student data will be onboarded when the enrollment content is migrated in early-to-mid calendar 2022.

Badger Analytics User Training

The Badger Analytics user training course is currently being developed in Canvas. However, those who are ready to get started may begin by completing the following:

SQL Baseline

Completion of either of these two LinkedIn Learning courses is sufficient to demonstrate SQL proficiency. A screenshot of the certificate from either course, or a passing score on the final exam of the first course, will serve as satisfactory proof of completion. If you feel your SQL knowledge is already sufficient, you may take the final exam in the first course without reviewing the material.

  1. Learning SQL Programming
  2. SQL: Data Reporting and Analysis

Introduction to Snowflake

The following courses provided by Snowflake should serve as a good introduction to the Snowflake environment and interface:

  1. LVLUP-101, Level Up: First Concepts
  2. LVLUP-102, Snowflake Key Concepts
  3. LVLUP-103, Level Up: Snowflake Ecosystem
  4. LVLUP-105, Container Hierarchy
  5. LVLUP-201, Query History and Caching
  6. LVLUP-202, Context
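
To give a flavor of the container hierarchy and "context" concepts covered in LVLUP-105 and LVLUP-202: every Snowflake session runs with a current role, warehouse, database, and schema, set with USE statements. A minimal sketch follows; the object names are hypothetical, not actual Badger Analytics names.

```sql
-- Set the session context; all names here are illustrative only.
USE ROLE analyst_role;        -- governs which objects you may access
USE WAREHOUSE query_wh;       -- governs the compute used to run queries
USE DATABASE badger_demo_db;  -- container for schemas
USE SCHEMA enrollment;        -- container for tables and views

-- With the context set, unqualified names resolve against it:
SELECT COUNT(*) FROM term_enrollment;

-- Equivalent fully qualified reference, independent of context:
SELECT COUNT(*) FROM badger_demo_db.enrollment.term_enrollment;
```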

The following YouTube videos provide additional information about the Snowflake environment:

  1. Getting Started – Introduction to Snowflake
    1. Documentation
  2. Getting Started – Architecture and Key Concepts
  3. Accelerating BI Queries with caching in Snowflake

Case Study

Developers are required to complete a case study on a Badger Analytics/Snowflake data set to demonstrate proficiency in both SQL and the Snowflake environment. If you have completed the above SQL and Snowflake requirements and you are ready to begin the case study, please reach out to us by email to get started.


Frequently Asked Questions

I am a current InfoAccess user. How do I get access to Badger Analytics?

Our approach to onboarding users into Badger Analytics is first to determine that current InfoAccess users have the skills and appropriate use cases to use Badger Analytics effectively. All current InfoAccess accounts are being evaluated by a local implementation manager from each unit or division. Once we have determined a Badger Analytics account is appropriate, the user will be onboarded during the earliest sprint in which their needed data is migrated; users can expect to be contacted about next steps for onboarding and training during that sprint. If ODMAS and a local implementation manager determine that Badger Analytics is not the best solution for a particular user or use case, we will work with that user to find the best fit for their needs.

I run Hyperion Interactive Reporting (IR) queries against the InfoAccess views, but don’t know how to migrate them to Badger Analytics. What action do I need to take and when?

There are specific readiness actions that you can take now to prepare for the migration to Badger Analytics which will be implemented over the next 18 months:

  • Inventory the .bqy files you have saved in Hyperion IR. Are they all still active and in use? If so, document what is current and of value: file name, business purpose, audience, timing (e.g., run annually, quarterly, daily, or ad hoc), which InfoAccess views and data elements are used, and any pain points associated with using it (for example, a need to join data from multiple campus data warehouses such as EPM, InfoAccess, or WISER/SFMRT, external sources, or file extracts).
  • Compare the campus dashboards available in RADAR to your inventory. Are there any existing reports that you can use right now? That means a conversion you don’t have to do! Are there any existing reports that might work with some enhancements? Make a note to discuss your requirements with your LIM so that we can track them and work with the Tableau developer to incorporate them now or in future sprints.
  • If you and your LIM agree that you will be on a path to Badger Analytics access, you can begin to familiarize yourself with the free SAS Enterprise Guide query tool. Follow the installation instructions available on the Knowledge Base, and practice migrating a few queries; consider reaching out to peer SAS enthusiasts or ODMAS if you get stuck. Once your LIM has identified you as a prospective Badger Analytics user, you will be added to the monthly Badger Analytics User Group, where you will collaborate with peers on Snowflake and data architecture training and certification opportunities as we progress through the project sprints.
  • Once the new content areas are available in Badger Analytics, you will be able to begin transitioning your inventory of .bqy files to the right BI solution (e.g., Snowsight, Tableau, SAS, R/Python, Stata, etc.) and will be able to participate in group “BAckathons” to exchange ideas and best practices with other users and ODMAS experts in the same room.
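
In practice, transitioning a .bqy file usually means re-expressing its underlying SQL against the new Snowflake objects. The sketch below shows the general shape of that conversion; the view, schema, and column names are hypothetical, and the actual Badger Analytics object names will differ.

```sql
-- Hypothetical InfoAccess-style query extracted from a .bqy file:
--   SELECT emplid, name, acad_career
--   FROM   infoaccess.cdr_student
--   WHERE  strm = '1226';

-- The same logic re-pointed at a (hypothetical) Badger Analytics view,
-- using Snowflake's database.schema.object naming:
SELECT emplid,
       name,
       acad_career
FROM   badger_db.student.cdr_student_v
WHERE  strm = '1226';
```

The query logic itself often carries over with little change; the work is in mapping old InfoAccess views and columns to their new homes in Snowflake.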

I am not a current InfoAccess user, but I would like access to Badger Analytics. What should I do?

Please reach out to your LIM (see table above). They will help determine if Badger Analytics, Tableau, or another solution is best for your needs. If they determine Badger Analytics to be the best fit, ODMAS will add you to the appropriate onboarding group.

I have data/reporting needs, and I am unsure about how to best meet them. What should I do?

Please reach out to us by email, and we will get you started on the right path!

What are the benefits of this project to people who need to do reporting and other data analysis?

A main goal of the project is to create a unified base for analytics by consolidating and cross-integrating strategic mission data assets into a single cloud data platform built on Snowflake. The current process of pulling in data is very labor-intensive for the analyst: duplicate views make it hard to know which views to use, lineage isn’t clear, and the data structures were built for specific purposes that other users are now trying to stretch to different ones. InfoAccess was a data warehouse created without the whole in mind, and reusing existing queries for alternative purposes is risky from a security, quality, and accuracy standpoint.

What are the risks to maintaining the status quo? What about people who feel the current state meets their needs?

In addition to the problems resulting when a single source of truth doesn’t exist, there are other risks: 

  • Security vulnerabilities exist in some of the older/unsupported query tools. Migrating users to supported query tools, and increasing skill sets in both data and querying, ensures the quality of analysis and results.
  • Users are not protected when changes are made to the underlying data models, and therefore must maintain any queries they run whenever the underlying data structures change.
  • Data security is an important and ongoing concern, and the status quo carries a greater potential for data breaches. Having users access data through their own authentication to Badger Analytics (rather than through data someone has extracted and saved on a network drive) greatly reduces the risk of a data breach.

I have some long running queries. Will this help?

We are seeing 50-100% faster query responses, resulting in faster data loads and faster report responses. Queries using Snowflake aren’t going to take the hours they did in InfoAccess. The well-architected data structures allow for a high-performing database producing consistent results.
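
Part of the speedup comes from Snowflake’s result cache: rerunning an identical query against unchanged data returns the cached result rather than recomputing it on the warehouse. The sketch below illustrates the idea; the view name is hypothetical, but USE of the result cache and the QUERY_HISTORY table function are standard Snowflake features.

```sql
-- First run: executes on the virtual warehouse.
SELECT term, COUNT(*) AS enrollments
FROM   badger_db.student.enrollment_v   -- hypothetical view name
GROUP  BY term;

-- An identical rerun (within the cache retention window) is served
-- from the result cache without consuming warehouse compute.

-- Inspect recent runs and their execution times:
SELECT query_text, execution_time
FROM   TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
ORDER  BY start_time DESC
LIMIT  10;
```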

How will this help those with future reporting and data analysis needs?

Current and future users will see many benefits, including: 

  • Common script libraries are being developed to make it faster for query developers to generate necessary jobs and to address data quality issues in local queries. 
  • Standard queries for training and access will help with the onboarding of new staff. 
  • Reduced workloads: some users can move from writing their own queries to consuming queries developed by others, and the work of those consumers shrinks to filtering data rather than building multiple queries, eliminating duplicate work currently taking place across campus. Today, users with similar needs are reinventing similar solutions, unaware that something meeting their needs may already exist; steering people toward dashboards that produce the same results removes the need for that duplicated effort.

Why was SAS chosen as the university’s supported tool?

ODMAS consulted with end users about their use cases and analyzed the available business intelligence ad-hoc tools. Toad Data Point was originally intended to replace Hyperion Interactive Reporting after vendor and UW System support ended; after surveying users and evaluating the available tools, ODMAS decided to stop supporting Toad Data Point at the conclusion of its support contract. ODMAS now recommends SAS Enterprise Guide, which was already licensed and available to the campus community. SAS is capable of cross-database joins, which was one of the requirements for tool selection. If you are a current UW-Madison employee, please contact us by email for a full copy of the tool analysis report.
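
To illustrate the cross-database join requirement: in SAS Enterprise Guide, a query can combine sources from different database systems once a libref is assigned to each. The sketch below shows the general shape; the librefs, views, and columns are all hypothetical, and in SAS the statement would sit inside PROC SQL.

```sql
/* In SAS Enterprise Guide this query would run inside PROC SQL,
   with librefs (e.g., snow, oraw) assigned to each source system.
   All object names below are hypothetical. */
SELECT s.emplid,
       s.acad_plan,
       h.annual_rate
FROM   snow.student_v AS s   /* libref pointing at Snowflake data */
JOIN   oraw.hr_comp_v AS h   /* libref pointing at an Oracle source */
       ON s.emplid = h.emplid;
```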

Is it permissible to use other graphical user interface (GUI)/visual tools that users are comfortable with?

Many tools were tested, and users are still permitted to use other tools. Choosing a standardized tool allows us to focus our support and training on SAS specifically and helps eliminate the frustration of constantly changing tools. Please contact us by email for a full copy of the tool analysis report, which walks through our process, showcases real user feedback, and documents some of the strengths and weaknesses of each tool.
