Conquering the domain
Surrounded by domain experts and VMware veterans, I had the opportunity to delve deep into the domain and understand the guts of our platform and its evolution. My approach to designing software systems is to learn as much as possible about the discipline so that I can make informed decisions about the data during the design phase.

CloudPhysics offers a successful cloud-based SaaS application that gives admins valuable insight into their virtual infrastructure. The key is that CloudPhysics collects large volumes of data and performs complex transformations to derive metrics that help an admin drive critical decisions for their datacenter. The time from installation to meaningful information is negligible, which makes the company very attractive to the IT world.

For any product designer, the obvious first step is to experiment with and explore unfamiliar software: learn the workflows and interactions, draw parallels with systems they have worked with, and build an overall system map. During this investigative phase, we can quickly identify areas in need of restructuring, a visual upgrade, or a complete overhaul. While design upgrades are an essential part of enhancing the experience of a tool, in the enterprise world the designer isn't necessarily expected to master the subject matter. Enterprise software designers often operate on portions of a product, or work in a silo, taking requirements and direction from their product and engineering teams.

At CloudPhysics, I set out to disrupt this notion and define the true role of a designer. I took the time to learn the fundamental concepts of virtual computing, from CPU/memory resource allocation and usage to capacity planning, from storage efficiency to performance optimization. This brought me closer to the use cases that are top priority for a virtual datacenter admin.
I was a sponge, continuing to learn from VMware knowledge base articles and blogs by well-known industry leaders, and often poring over documentation to verify my understanding of core concepts. It's not uncommon for ambiguity to persist about certain concepts, even among experts, when the information is encyclopedic and complex in nature. For my own learning, and with the goal of providing a visual reference for the company, I created diagrams and flows that capture a comprehensive picture.
Coincidentally, working at CloudPhysics took me back to my days as a circuit design engineer, when I was deeply embedded in the world of computer hardware. I've been fortunate to be able to draw on that past experience, which helped me ramp up rapidly. Not only did I have chip design experience to draw from, but I could also leverage what I'd learned designing visualizations for business intelligence tools and working on the design of a network monitoring product. I felt technically equipped to take on design for the CloudPhysics platform.
Real data is virtuous
I cannot stress enough the importance of understanding real data in delivering successful design. Knowing the data reduces the number of times you have to go back to customers to understand what they need. At CloudPhysics, with all the resources available to me, real data became my primary weapon. I went to great lengths to understand the metadata we collect, the transformations performed on it, the nuances and challenges of delivering data quickly, and the various constraints of collecting and presenting vast amounts of information without diluting the message or misrepresenting the problem. I had to understand degrees of relevance for data: how a collection of metrics can unite to serve a range of use cases. By getting intimately familiar with the information required to convey the story, I could decide which metrics should be presented as secondary or supporting information, and how to incorporate relevant information in limited screen real estate while remaining clear and direct. In the end, we designed a data-dense view that looked good and also completed the story.

We were determined to present a 30-day time series plotted at 20-second intervals. To understand the bounds and outliers of the data and to select the correct visualization for each dataset, I first plotted several samples in tools such as DataHero and Excel. This helped immensely in making decisions about allocating space (pixels) and designing pan and zoom controls for these non-aggregated time series plots. Instead of drawing an arbitrary time series in Illustrator with the pen tool, I used plots generated from real data in my design. Real data also goes a long way toward identifying the best visualization for a dataset. My proudest achievement at CloudPhysics has been studying and experimenting with Tufte's sparklines and using them in our dashboard application.
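The case for pan and zoom on a 30-day, 20-second series comes down to simple arithmetic. A back-of-envelope sketch (the chart width here is an assumed value for illustration, not a measurement from the actual design):

```javascript
// Back-of-envelope check: how many raw samples a 30-day series at a
// 20-second interval produces, and how many samples compete for each
// horizontal pixel column of a chart at a given width.
function samplesPerPixel(days, intervalSec, chartWidthPx) {
  const totalSamples = (days * 24 * 3600) / intervalSec;
  return { totalSamples, perPixel: totalSamples / chartWidthPx };
}

const { totalSamples, perPixel } = samplesPerPixel(30, 20, 1000);
// totalSamples is 129600 — at an assumed 1000 px of chart width, roughly
// 130 samples land on every pixel column, far more than can be drawn
// faithfully in a single static plot.
```

Numbers like these make the design trade-off concrete: either the series is aggregated (which the team wanted to avoid) or the view needs pan and zoom to let the user reach the raw resolution.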
They aren't just pretty; they convey a story, one that emphatically supports the goal of the content we are displaying. On the surface it may seem simple to apply sparklines to suitable datasets, but only when you understand the data can you assess the efficacy of these mini visualizations in a data grid. Are they effective for comparison? Do they convey the trend and support the primary metric? These are the questions I evaluated. In the end, my design has withstood the diverse datasets of our large customer base.
The sparklines in the grid were first created in jQuery using "realistic" data, and I placed screenshots of them in the design. Using the sparkline generation tool, I was able to experiment with height and width. This sparkline plots 30 days of data bounded to the range 0–24.
Plots created using real data (tools: DataHero, Excel) to make decisions about chart dimensions, pixel density, and pan and zoom on a non-aggregated, non-rolled-up dataset
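The essence of a bounded sparkline like the one above is a fixed mapping from data to pixel space. A minimal sketch of that mapping (the production version used a jQuery plugin; the function name, dimensions, and sample values here are illustrative):

```javascript
// Map a series of values onto SVG <polyline> coordinates for a sparkline.
// Values are clamped to the known bounds (0–24 by default) so an outlier
// cannot distort the shared vertical scale across rows of the grid.
function sparklinePoints(values, { width, height, min = 0, max = 24 }) {
  const stepX = width / (values.length - 1); // even horizontal spacing
  return values
    .map((v, i) => {
      const clamped = Math.min(max, Math.max(min, v));
      // SVG's y axis grows downward, so invert the normalized value.
      const y = height - ((clamped - min) / (max - min)) * height;
      return `${(i * stepX).toFixed(1)},${y.toFixed(1)}`;
    })
    .join(' ');
}

// 30 daily values would yield 30 coordinate pairs; a short sample:
const pts = sparklinePoints([0, 12, 24, 6], { width: 90, height: 20 });
```

Keeping the vertical scale pinned to the dataset's real bounds, rather than each row's own min and max, is what makes the sparklines comparable down a column of the grid.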
There was initial skepticism about the density of information in our widgets. Some of this came from comparing our dashboard to those frequently seen in business intelligence applications and other monitoring platforms. Most popular dashboards are a collection of tiles of varying sizes organized in a fluid grid layout. Most commonly, the tiles present a single large number, a data grid ranking or listing top items, or simple charts representing a distribution or history. Depending on the underlying charting framework, some dashboards offer additional visualization options such as scatter plots. This pattern of dashboard design is pervasive and often perceived as "very clean" and "simple." The focus is on establishing interface elements that can support two or three design patterns. The designers concentrate on aesthetics and visuals, and often use dummy content to compose the design. This approach supports scalability well, but with one major caveat: it constrains the data. Data is relegated to second-class citizenship. Because the actual data is decoupled from the container or framework that hosts it, the focus shifts to picking data that will fit the structure.

At CloudPhysics, while endeavoring to create a scalable dashboard, we too adopted the approach of tiles, or widgets, but our main driver for this choice was customizability, allowing content to multiply. In our design, the widgets present rich information based on the content we designed. With every widget, we honed the design to extract global or common elements and establish consistency in the experience. We continuously adjusted the design of the containers and common elements, always prioritizing content, i.e., the data. Our customers responded to our dashboard with great satisfaction at the issues we were highlighting and delight at the flexibility and experience of our design. They were impressed by the depth and coverage of the information.
Our dashboard is still evolving but our first release makes a bold statement about its mission:
"We are not afraid to take a stand because we believe in our data to deliver conclusive evidence and complete information. Our information is useful and actionable."