#71 Adventures in Data Maturity - Creating Reliable, Scalable Data Processes - Interview w/ Ramdas Narayanan

Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/

Please rate and review us on your podcast app of choice! If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here. Episode list and links to all available episode transcripts here.

Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.

Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here.

Ramdas' LinkedIn: https://www.linkedin.com/in/ramdasnarayanan/

In this episode, Scott interviewed Ramdas Narayanan, Vice President, Product Manager of Data Analytics and Insights at Bank of America. To be clear, he was not representing the company and was sharing his own views.

Ramdas came on to discuss lessons learned from building effective data sharing at scale on the operational plane over the last 5-10 years, so we can apply those lessons to our data mesh implementations. A key output of the conversation is a guiding principle for getting data mesh right: your goal is to convert data into effective business outcomes. It doesn't matter how cool your platform is or anything else - drive business outcomes! That goal is easy to lose in the tool talk and everything else surrounding data mesh.

Per Ramdas, when creating a data product - or really any data initiative - you need to align first on business objectives, and that alignment will drive funding. In the financial space, that means direct, literal funding, but even outside it, you should have the same mindset. Make sure you get engagement and alignment across business partners, technologists, and subject matter experts.
How are you using technology to address or solve the business problem?

Ramdas has seen that if you don't focus on creating reusable data, you can create silos - you need cohesive datasets, not bespoke datasets for every challenge, as that approach just doesn't scale. You should also study the data sources you are using: is there additional useful data you could add to your dataset, or could you use that data for other purposes? Keeping an eye out for additional data to drive business value will add a lot to your organization.

When working with developers, Ramdas recommends helping them understand how the business is going to consume and use the data, and then figuring out whether they should deliver data as something like an API or web service or as a more custom batch delivery. It is also important to work with data consumption teams so their consumption demands stay reasonable - getting them to modernize can be a challenge, and that can put an unreasonable burden on producing teams.

Ramdas talked about how crucial conversations and culture are...

About the Podcast

Interviews with data mesh practitioners, deep dives/how-tos, anti-patterns, panels, chats (not debates) with skeptics, "mesh musings", and so much more. Host Scott Hirleman (founder of the Data Mesh Learning Community) shares his learnings - and those of the broader data community - from over a year of deep diving into data mesh.

Each episode contains a BLUF - bottom line, up front - so you can quickly absorb a few key takeaways and also decide if an episode will be useful to you - nothing worse than listening for 20+ minutes before figuring out if a podcast episode is going to be interesting and/or incremental ;) Hoping to provide quality transcripts in the future - if you want to help, please reach out!

Data Mesh Radio is also looking for guests to share their experience with data mesh! Even if that experience is "I am confused, let's chat about" some specific topic. Yes, that could be you! You can check out our guest and feedback FAQ, including how to submit your name to be a guest and how to submit feedback - including anonymously if you want - here: https://docs.google.com/document/d/1dDdb1mEhmcYqx3xYAvPuM1FZMuGiCszyY9x8X250KuQ/edit?usp=sharing

Data Mesh Radio is committed to diversity and inclusion. This includes in our guests and guest hosts. If you are part of a minoritized group, please see this as an open invitation to be a guest, and please hit the link above.

If you are looking for additional useful information on data mesh, we recommend the community resources from Data Mesh Learning. All are vendor independent: https://datameshlearning.com/community/

You should also follow Zhamak Dehghani (founder of the data mesh concept); she posts a lot of great things on LinkedIn and has a wonderful data mesh book through O'Reilly. Plus, she's just a nice person: https://www.linkedin.com/in/zhamak-dehghani/detail/recent-activity/shares/

Data Mesh Radio is provided as a free community resource by DataStax.
If you need a database that is easy to scale - read: serverless - but also easy to develop for - many APIs including gRPC, REST, JSON, GraphQL, etc., all of which are OSS under the Stargate project - check out DataStax's AstraDB service :) Built on Apache Cassandra, AstraDB is very performant and, oh yeah, is also multi-region/multi-cloud, so you can focus on scaling your company, not your database. There's a free forever tier for poking around/home projects, and you can also use code DAAP500 for a $500 free credit (apply under payment options): https://www.datastax.com/products/datastax-astra?utm_source=DataMeshRadio