Postman is an API platform that simplifies each step of the API lifecycle and streamlines collaboration to help users create better APIs, faster. To offer the best to its customers, the company relies on data-driven decision-making.
To gain a better understanding of how Postman looks at analytics, we interviewed Prudhvi Vasa, who leads a team of data analysts and engineers at the company. Prudhvi is an analytics leader with extensive experience across industries including retail, e-commerce, technology, and telecommunications.
In this blog, we will learn more about Postman’s current data stack, how they structure their data team, the core skill they look for when hiring, and how they enable different functions so that everyone in the organization is aligned on data. Let’s dig in.
Postman’s Data Stack
At Postman, product and data are at the centre of every conversation. Understanding user behaviour in the product and offering the best experience for developers is key. As you can imagine, there is a tonne of data coming in from everywhere. To put that data behind every decision, Postman has invested in some of the best solutions, including the following:
BI & Dashboards- Postman uses Looker to build interactive dashboards that track KPIs and metrics for key stakeholders across the company.
Data Ingestion- There is a tonne of application and event data to gather; Postman uses an in-house data ingestion engine called Fulcrum, along with Hevo Data for third-party data ingestion.
Data Cataloging- It’s important to have one home where data teams can map lineage, maintain a data catalog, and make the best use of metadata. Postman uses Atlan to solve for this.
Data Warehouse- Postman uses Amazon Redshift as their data warehouse.
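To make the flow concrete, here is a minimal sketch of the ingestion pattern a stack like this implies: land raw event data in S3, then bulk-load it into Redshift with COPY. Every bucket, cluster, table, and credential name below is a hypothetical stand-in; this illustrates the general pattern, not the actual implementation of Fulcrum or Hevo Data.

```python
# Minimal sketch: stage raw events in S3, then COPY them into Redshift.
# All names (bucket, cluster, table, role, credentials) are hypothetical.
import json

import boto3
import psycopg2  # any Postgres-compatible driver works with Redshift


def land_events_in_s3(events: list[dict], bucket: str, key: str) -> str:
    """Write a batch of raw events to S3 as newline-delimited JSON."""
    body = "\n".join(json.dumps(e) for e in events)
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=body)
    return f"s3://{bucket}/{key}"


def copy_into_redshift(conn, s3_path: str, iam_role: str) -> None:
    """Bulk-load the staged file into a Redshift staging table."""
    with conn.cursor() as cur:
        cur.execute(
            f"""
            COPY staging.raw_events
            FROM '{s3_path}'
            IAM_ROLE '{iam_role}'
            FORMAT AS JSON 'auto';
            """
        )
    conn.commit()


if __name__ == "__main__":
    conn = psycopg2.connect(
        host="example-cluster.redshift.amazonaws.com",
        port=5439, dbname="analytics", user="loader", password="...",
    )
    path = land_events_in_s3(
        [{"user_id": 1, "event": "collection_created"}],
        bucket="example-raw-events", key="2023/05/batch-001.json",
    )
    copy_into_redshift(conn, path, iam_role="arn:aws:iam::123456789012:role/redshift-copy")
```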
What is at the core of Postman’s data team?
As we are witnessing numerous technology disruptions in the data space, the role of a data team member is evolving at a rapid pace.
To keep up with the changing times, a data team member must be versatile and adaptable to new and emerging technologies. For instance, the advent of cloud computing, machine learning, and big data analytics has brought about significant changes in the way data is stored, processed, and analyzed. In addition, with the emergence of code abstraction layers (e.g., ChatGPT), the role of an analyst is evolving, but what remains at the core is a natural inclination towards using data-driven insights to identify and solve problems.
This requires a deep understanding of the business and the data itself, as well as an ability to think critically and creatively. Furthermore, data teams must possess excellent communication and collaboration skills in order to effectively communicate their findings and work with other teams within the organization. If you’re looking to be a data analyst in 2023, here’s what you should know.
How does Postman look at building data products?
Postman applies the core fundamentals of the software development lifecycle (SDLC) when building data products and artefacts. This ensures that analytical processes are well-defined and optimized for efficiency.
SDLC is the process software organizations follow to manage software projects; it includes a detailed plan for developing, maintaining, replacing, and enhancing specific features. Data products that tackle complex data challenges benefit from following the same lifecycle.
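As a rough illustration of what SDLC discipline looks like for a data artefact, consider a metric definition that lives in a versioned, reviewed module and ships with a unit test, exactly as application code would. The metric and names below are hypothetical, not Postman’s actual logic.

```python
# Sketch of SDLC discipline applied to analytics code: the metric
# definition is an ordinary function in a versioned module, and a unit
# test pins its behaviour. The "weekly active users" metric here is a
# hypothetical example, not Postman's actual definition.
from datetime import date, timedelta


def weekly_active_users(events: list[dict], week_start: date) -> int:
    """Count distinct users with at least one event in the given week."""
    week_end = week_start + timedelta(days=7)
    return len({
        e["user_id"]
        for e in events
        if week_start <= e["occurred_at"] < week_end
    })


def test_weekly_active_users():
    events = [
        {"user_id": 1, "occurred_at": date(2023, 5, 1)},
        {"user_id": 1, "occurred_at": date(2023, 5, 2)},  # same user, counted once
        {"user_id": 2, "occurred_at": date(2023, 5, 9)},  # outside the week
    ]
    assert weekly_active_users(events, week_start=date(2023, 5, 1)) == 1
```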
Postman operates under a service model, providing data to internal consumers to meet their needs. This model allows Postman’s data teams to identify repeating patterns in data usage, which are then leveraged to create new products. For example, Postman developed Fulcrum, an in-house data ingestion tool, to better serve internal consumers' ETL needs. The team is also dedicated to continually improving its service offerings and expanding the product line to meet the ever-evolving needs of internal data consumers.
“We are primarily in a service model serving internal consumers with their data needs - and when we see a repeating pattern, we build products (ex. Fulcrum for data ingestion).”
Building on these fundamentals to make the data function successful, Postman has found that hiring for an Analytics Engineer role has proven to be a valuable asset for the data organization.
What does an Analytics Engineer do at Postman?
At Postman, the role of an Analytics Engineer has proven to be a success. The primary objective of an Analytics Engineer at Postman is to ensure that analytical code is performant: optimized to run as efficiently as possible, so that the system can handle large volumes of data and complex queries without slowing down.
“The primary objective of the Analytics Engineer is to:
-Make sure the code is performant
-Keep tech debt in control and improve our overall code
-Provide guidance and training to the analysts for best practices
-Monitor key metrics around query times, load on our warehouse, run time of our analytics jobs, and more.”
In addition to optimizing the code, an Analytics Engineer is responsible for keeping technical debt in control and improving the overall quality of the codebase. This requires a deep understanding of the system architecture and the ability to identify areas for improvement. By continuously improving the codebase, the Analytics Engineer helps to ensure that the system remains scalable and reliable.
A key part of the role is guiding and training analysts on best practices: helping them understand the underlying data model, design efficient queries, and interpret the results.
Finally, the role involves monitoring key metrics around query times, load on the warehouse, and the run time of analytics jobs. By keeping a close eye on these metrics, the Analytics Engineer can quickly identify potential issues and take proactive steps to address them before they become a problem.
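As a sketch of what that monitoring can look like on a Redshift warehouse, the snippet below pulls the slowest queries of the past day from Redshift’s STL_QUERY system table and flags any that exceed a runtime budget. The budget, connection details, and alerting hook are assumptions for illustration, not Postman’s actual tooling.

```python
# Sketch: surface the slowest recent queries from Redshift's STL_QUERY
# system table and flag those over a (hypothetical) runtime budget.
import psycopg2

SLOW_QUERY_SQL = """
    SELECT query, TRIM(querytxt) AS query_text,
           DATEDIFF(ms, starttime, endtime) AS duration_ms
    FROM stl_query
    WHERE starttime >= DATEADD(day, -1, GETDATE())
    ORDER BY duration_ms DESC
    LIMIT 20;
"""

RUNTIME_BUDGET_MS = 60_000  # hypothetical 60-second budget per query


def report_slow_queries(conn) -> None:
    with conn.cursor() as cur:
        cur.execute(SLOW_QUERY_SQL)
        for query_id, query_text, duration_ms in cur.fetchall():
            if duration_ms > RUNTIME_BUDGET_MS:
                # In practice this would feed a dashboard or alerting tool.
                print(f"[slow] query {query_id}: {duration_ms} ms - {query_text[:80]}")


if __name__ == "__main__":
    conn = psycopg2.connect(
        host="example-cluster.redshift.amazonaws.com",
        port=5439, dbname="analytics", user="observer", password="...",
    )
    report_slow_queries(conn)
```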
Analytical enablement to bring alignment
Enabling everyone in an organization to consistently gain insights from data is a challenging task. It requires a multifaceted effort spanning investments in technology, process improvement, basic business intelligence (BI) training, and change management. However, these initiatives are often implemented in silos, resulting in under-utilised tools, proof-of-concept graveyards, report factories, and gaps in processes.
To avoid these common problems, it's important to take a holistic approach to digital transformation when it comes to analytics programs. This means not only investing in the technology and training needed to achieve success, but also creating a culture of data-driven decision making throughout the organization. This can involve regular training sessions, data literacy programs, and cross-functional collaboration to ensure that everyone is on the same page.
In addition, it's important to recognise that digital transformation is an ongoing process, and not a one-time event. As such, it's important to regularly assess analytics programs and identify areas for improvement. This can involve gathering feedback from stakeholders, monitoring key performance indicators, and conducting regular audits of the analytics tools and processes in place.
One of the enablement programs Postman runs is creating use-case-based templates in Confluence on how to do particular analyses for different teams. This practice ensures that all team members are equipped with the knowledge needed to conduct analyses and generate insights. Moreover, it fosters a sense of collaboration and knowledge-sharing, which can lead to better decision-making and more effective problem-solving.
Furthermore, Postman's investment in data solutions extends beyond the technical tools. The company also places a strong emphasis on collaboration and on conducting user studies, which helps ensure that the data being generated is useful and actionable. Through these efforts, Postman is able to stay ahead of the curve in data analysis and management, and is well-positioned to continue its success in the future.
Conclusion
For many years now, companies have been making efforts to become more data-driven. However, despite these efforts, the results have been mixed. The success of such endeavours is not always immediate and takes time to play out within organizations.
What distinguishes the companies that ultimately succeed from those that continue to struggle is their persistence, resilience, execution, and relentless drive to employ data to make more informed business decisions. These companies are determined to integrate data into their decision-making processes, and to foster a culture that values data-driven insights.
This requires a multifaceted approach that involves investments in technology, process improvement, basic business intelligence (BI) training, and change management. But, most importantly, it requires individuals and teams to be fully committed to leveraging data to improve decision-making at all levels of the organization.
So, there is a lot that happens behind the scenes to make an organization truly ‘data-driven’, and Postman is a great example to follow! Postman has successfully implemented processes to continuously improve its analytical capabilities and has created a culture of data-driven decision-making throughout the organization.