Culture and Cognition in DevOps with Alchemist Accelerator’s Rachel Chalmers
Sam invites Rachel Chalmers, an investor, advisor, and technology industry analyst for over 20 years, for a candid conversation about the DevOps culture of shared purpose and blamelessness. Sam and Rachel explore how process, trust, and care for each other create more innovation and give us the opportunity to change and grow as human beings.
Partner at Alchemist Accelerator
Rachel Chalmers: This culture of shared purpose and blamelessness, this sort of dedication to finding the truth and disinterest in holding people accountable for what are very human errors in the vast majority of cases. There's a quote by a South American economist, Fernando Flores, that I really love. "Work is the making and keeping of promises."
Sam Ramji: Welcome to Open||Source||Data. Today, I'm going to interview Rachel Chalmers. Rachel is awesome. She's been an investor, advisor, and technology industry analyst for over 20 years. I first got to meet her when she was at 451 Research, as she was building out the infrastructure practice there. She was VP of Infrastructure and she was one of the first industry analysts to cover Cloudera, Splunk, and VMware. After that, I ran across her again when she was a founding board member of the Ada Initiative. Some smart listeners may remember that Ada Lovelace was actually the first programmer in history, programming Charles Babbage's Analytical Engine; that's where the Ada Initiative's name comes from. And I had the privilege of being one of the first hundred financial supporters of that social justice organization, which has done amazing work. She's also been a board observer and advisor to 10 companies, including Docker, StrongLoop, Aviatrix, StreamSets, and a number of others, and she's a frequent speaker and author on technology and infrastructure. She's been published in New Scientist and Salon. And she's currently on the advisory boards of Compass and Honeycomb.io.
Sam Ramji: Honeycomb of course, with the inimitable mipsytipsy, as she's known on Twitter. You'll never find a lack of awesome opinions from Honeycomb and they also hired some extraordinary SREs recently from Google and other places. Doing great work on observability. So Rachel, thank you so much for taking the time to talk with us.
Rachel Chalmers: Oh, it's always such a pleasure to hang out with you, Sam. Thank you for asking me.
Sam Ramji: Likewise. And I got to hang out with you for like a year, while we were transforming Autodesk into a cloud and data company, which was just an awesome experience. And you were a very bold leader there: you were navigating the barriers of corporate investment processes and corporate culture and politics while bringing in your inimitable sense of how innovation and disruption really happen, drawing on what you had done in the venture capital world. So you did some really neat stuff there, bringing a rapid innovation mindset as well as the actual structure for how you could do that in a large corporation. Right? How do you have a startup board that is accountable for lines of business? It was amazingly cool.
Rachel Chalmers: It was a real phase change in my career. I had been very much a small company person up until then and here I find myself doing corporate innovation, imagine my surprise.
Sam Ramji: Yeah. You moved from there to Alchemist Accelerator, which seems like the most obvious transition ever, all of a sudden bridging what you did in your career as an analyst and an investor and a board member into the corporate world. And I'd love to hear you talk a little bit more about what took you to Alchemist and what you're practicing there.
Rachel Chalmers: So for those of you who aren't familiar, Alchemist is probably the least well known of the three big Accelerators here in the Valley. And what's really special about it to me is the people and it starts with Ravi Belani, who's the founder. He's a lecturer in entrepreneurship at the engineering school at Stanford, and he's one of my two favorite bosses, Sam being the other.
Rachel Chalmers: Ravi is just incredibly lovely and great to hang out with and super smart and caring, and he's built a community around him that reflects all of those values and all of those practices. And so Alchemist is now in its 26th cohort. We have graduated 400 companies. I first got involved as a judge for the demo days and then as a mentor and coach. And Ravi and I had been talking for years about trying to figure out a full-time role for me with the Accelerator. And once I'd wound things up at Autodesk, it became apparent that a bunch of Alchemist's corporate partners were seeking to import Silicon Valley techniques and approaches and ideas and make those available to their employees. And so I've taken on that business as the head of corporate innovation for Alchemist. I'm doing a little bit of customer search and advisory work, and I'm doing some internal accelerators as well. And it's really fascinating.
Sam Ramji: It's amazing timing because, unbeknownst to all of us, I think you joined right when the pandemic was starting. And what we've all seen is that digital acceleration has gone to ludicrous speed, right? Everybody's turned plaid. Hopefully there's some Spaceballs fans in the audience, but the buzzword or the watchword was that we've done two years' worth of digital innovation and transformation in a couple of weeks or a couple of months, right? Just the acceleration factor has been spectacular. I'm curious to know what you've seen in that because of the conversations you're in and the techniques you're bringing. I'm assuming that the timing has ended up being accidentally perfect.
Rachel Chalmers: It has. We definitely did the digital transformation on the hard mode. I joined just as we were about to kick off one of our residential accelerators, which had been intended to run in Munich over the summer. We switched to all virtual, I think three weeks before bootcamp. We also threw an added curve ball in because that was right around the time that the Zoom security vulnerabilities were becoming evident. So our client switched from Zoom to Teams and we all had to learn Teams and figure out how to collaborate there instead of in Zoom and Slack. Massive kudos to the participants who just rolled with all of these changes. They figured out ways to define and accelerate their business models as these internal projects, reaching out to customers, doing customer interviews, having epiphanies, while dealing with suddenly having to work really as a distributed team. And some of these teams were distributed from Germany to Brazil or Israel or China. So they met the timezone challenges and they conquered them.
Sam Ramji: One of the things that I'm really curious about is, I think of you as a peak innovator in DevOps. And the reason I say that is not just because of the observations you've done; you brought us DevOps as an executive management style when we worked together at Autodesk. So when I look at the work you've done in security and AI/machine learning, I'm curious: what have you seen in the last 10 years around the DevOps landscape, in terms of the strengths you've seen and the potential for the industry?
Rachel Chalmers: I don't think of myself as a thought leader. I think of myself as somebody who's been very lucky with their friends. So, I met Jesse and Adam when they were just starting out when Chef was still Opscode, and I met Jez Humble at a Chef conference in fact, and we immediately hit it off and went off to the coffee bar and talked for the next four hours. So it's more that DevOps was an ocean that I was perfectly primed to jump into.
Rachel Chalmers: My deep, dark background is two degrees in English lit and thwarted desire to be an English professor. And I came into the technology industry again through friends, but fascinated by the literary culture of technology. I realized that source code is literature and that there deserves to be critical commentary around literature. And DevOps of course is more or less exactly that, it's partly about accelerating code from commit to deploy, but it's also about understanding how code works and what it is and what it does.
Rachel Chalmers: And Nicole Forsgren has done all of the math, showing the correlation between accelerating that deployment speed and having an incredibly functional and adaptable company around it. But to me, the key insight of DevOps and its ancestor cultures, like the Toyota production system and site reliability engineering, is this culture of shared purpose and blamelessness, this sort of dedication to finding the truth and disinterest in holding people accountable for what are very human errors in the vast majority of cases. There's a quote by a South American economist, Fernando Flores, that I really love: "Work is the making and keeping of promises." And DevOps kind of instantiates that in writing code and, as you say, in figuring out how to manage a very agile, very fast-moving enterprise. We don't punish one another for getting things wrong in good faith, but we publish the services that we're able to provide to one another and then opt to subscribe to other people's services.
Rachel Chalmers: And that harnesses distributed cognition in a very, very powerful way. One of the great enablers of DevOps is version control. You know, all of this happened after we started to have big, reliable version control systems and that idea of a distributed consensus algorithm underlying all of this, is to me incredibly powerful. It's about coming from many, many points of view, having many, many different implementations and finding the best fit across all of those potential options.
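[Editor's note: Rachel's point that distributed version control lets many independent viewpoints converge on a shared history can be illustrated with a toy sketch of Git-style content addressing. This is an illustrative simplification, not Git's actual object format; the field names and payload layout here are invented.]

```python
import hashlib

def commit_id(parent: str, author: str, message: str, tree: str) -> str:
    """Derive a commit identifier purely from content, Git-style.

    Because the id is a hash of the content, identical history always
    hashes to the same id on every machine, so distributed copies can
    agree on state without any central authority.
    """
    payload = f"parent:{parent}\nauthor:{author}\ntree:{tree}\n\n{message}"
    return hashlib.sha1(payload.encode("utf-8")).hexdigest()

# Two independent clones that apply the same change converge on the same id.
a = commit_id("0" * 40, "alice", "fix login bug", "tree-1")
b = commit_id("0" * 40, "alice", "fix login bug", "tree-1")
assert a == b  # consensus falls out of content addressing

# Any divergence in content produces a different id, so disagreement is visible.
c = commit_id("0" * 40, "alice", "different message", "tree-1")
assert c != a
```

This is the "distributed consensus" property in miniature: agreement on history reduces to agreement on hashes, which anyone can verify locally.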
Rachel Chalmers: And after 10, 15 years of working in these modes, you're starting to see these great companies come out which have baked into the core of their value prop this idea that you don't punish exploration and experimentation. I'd characterize my two faves, LaunchDarkly and Honeycomb, as both being like that. Feature flagging is a safety net for devs: it lets you test in prod in ways that mean you can roll back bad changes effortlessly. It doesn't have to be a career-ending incident. Honeycomb lets you ask really open-ended questions of your entire infra.
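[Editor's note: the feature-flagging safety net Rachel describes can be sketched in a few lines. This is a hypothetical minimal example, not LaunchDarkly's API; real flag systems evaluate per-user targeting rules served from a remote configuration store.]

```python
# Minimal sketch of feature flagging: rollout state lives in configuration,
# not in the code path, so "rolling back" is a config flip, not a redeploy.

FLAGS = {"new_checkout": False}

def checkout(cart_total: float) -> str:
    """Route between a known-good path and an experimental path via a flag."""
    if FLAGS["new_checkout"]:
        return f"new flow: {cart_total:.2f}"   # the experiment under test
    return f"old flow: {cart_total:.2f}"       # the known-good fallback

FLAGS["new_checkout"] = True       # "ship" the experiment to production
result = checkout(42.0)

FLAGS["new_checkout"] = False      # bad metrics? flip it back instantly
rollback = checkout(42.0)
```

The point of the pattern is exactly what Rachel says: a bad change costs one config flip, not an incident, which lowers the stakes of experimenting in prod.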
Rachel Chalmers: I think you see it also in the movement of really great infrastructure devs like Bethanye Blount and Josh McKenty, both of whom are now doing HR companies. They've understood how profound this culture change is and they're starting to build software to support the cultural changes needed to create this 10X productivity.
Sam Ramji: That is an awesome set of insights. I mean, one of the things that you pointed out was that reducing the stakes is a really powerful way to create more innovation. And if you can always roll back to something quite recent, then you don't feel like you have made a career-ending mistake, which lets your amygdala kind of let go. You can explore more. Josh McKenty is super interesting, having founded Piston Computing off of the work on OpenStack at NASA. At EQ Labs, we actually have the privilege at DataStax of working with Josh to help us transform how we do diversity and inclusion as a core cultural tenet with these kinds of micro lessons: sort of continuous improvement through the hints that their system provides through affordances like Slack and email and calendaring, which is pretty amazing.
Rachel Chalmers: Yeah. I mean we're in the amazing position of sitting on 50 years of incredibly detailed and imaginative work in behavioral psychology. We understand more about who we are and how we learn now than at any time before in history and distilling all of that unbelievable academic research into these products that can just gently nudge you in a slightly better direction. We know from experience that those nudges accumulate over time into very large changes.
Sam Ramji: Yeah. And you mentioned almost two bookends of that class of research, one being Fernando Flores, who was also one of the progenitors of speech act theory and ontological design. The sense that I've had is that software is codified thought: if you start with Flores and realize that ontological design is the way that we fluidly and dynamically structure our thoughts in language together, such that we can take action together, that's a pretty neat basis for how we build software. And then Dr. Forsgren: I had the good fortune to meet her a few years ago when Brian Kirschner and I were at Google, and we were so struck by her and her team's work with Gene Kim and Jez Humble that we made them an offer they couldn't refuse and acquired their company at Google when I was still there. And she has now gone to GitHub, where she's leading some transformative work around team performance as automated and instrumented through DevOps infrastructure.
Rachel Chalmers: It all comes back to version control. I do want to sound a warning note, though. I want to talk about AI and ML. Software is, as you say, codified thought. It's the distillation of a set of institutional best practices, and it is beyond naive to imagine that software doesn't reflect the biases of its creators. I love to joke that Matt Mackall, the creator of Mercurial, and Linus Torvalds, the creator of Git, went as far as naming their version control software after themselves. We talk a lot about scarcity and abundance mindset when we're talking about psychological safety. The fear of losing something very precious to us is a very negative impulse, and that fear of losing privilege, fear of losing advantage, gets baked into software, as it does when you see these face recognition technologies that don't even recognize Black faces as faces because they were only tested on white people. That's a really clear telltale, a giveaway, that we hug our tribe close and we see the other as the other. You can see my humanities background coming back in again.
Rachel Chalmers: When we do that, we're losing more than half of the potential ingenious solutions to problems that we desperately need; we're just leaving stuff on the table. And this is the dynamic you see over and over again in the software industry. It's so heartbreaking. Think of the engineer at Uber, Joseph Thomas, who tragically took his own life. We talk a lot about pipeline problems. We talk a lot about hiring for diversity, but when we get people from underrepresented backgrounds into our industry, their experience is not the same as white people's experience, and it can be terrible and unbelievably destructive.
Sam Ramji: Well, in a way that's kind of because we've moved from a logic centric industry to a data centric industry. And let me say what I mean by that. There was kind of a time when we were doing procedural programming and the biases that we put in the software were pretty explicit, right? We made this choice, we used this language, we did it in this way. You could retrace it through the design documents. You could trace it through the requirements. There was a lot outside of the world of logic that we were all blind to.
Rachel Chalmers: Yes.
Sam Ramji: Now that we've moved to a data-driven and a data-centric industry, the sources of data and the way that data flows are demonstrating this enormous quantity of implicit information and implicit bias that we weren't even aware of. Now that we're becoming aware of it, I think many of us are shocked or horrified, to use kind of typical language. But for other people, who've been on the other end of the discrimination spectrum, finally the data in our systems and the implicit bias match up with the experiences that they've had and have been talking about for a long time. So perhaps we're at a new boundary where believability, because you can replay and detect the implicit bias in the data, might be a note of change for us. We can build a better society because we can now see what was previously invisible.
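[Editor's note: Sam's point that bias becomes detectable once you can replay the data can be made concrete with a simple audit: compare historical outcome rates across groups. The records below are invented, and a real audit would control for confounders rather than treat a gap as proof of bias on its own.]

```python
from collections import defaultdict

def approval_rates(records):
    """Group historical decisions and compare positive-outcome rates.

    A large gap between groups is a signal worth investigating; it is
    the kind of implicit pattern that replaying the data makes visible.
    """
    totals = defaultdict(int)
    approved = defaultdict(int)
    for group, outcome in records:
        totals[group] += 1
        approved[group] += outcome
    return {g: approved[g] / totals[g] for g in totals}

# Hypothetical historical loan decisions: (group, approved?)
history = [("A", 1), ("A", 1), ("A", 0),
           ("B", 1), ("B", 0), ("B", 0)]

rates = approval_rates(history)              # per-group approval rates
gap = max(rates.values()) - min(rates.values())  # disparity to investigate
```

This is essentially a demographic-parity check, the simplest of the fairness metrics; its value is that it turns "invisible" bias in accumulated decisions into a number you can track.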
Rachel Chalmers: Yeah. And as a white person, I wanted to talk to specifically your white listeners and say, even though I was the one who raised this dark side of technology, I do think there's hope. And I think there's hope in exactly what we're talking about. Continuous improvement, nudges. I love blamelessness as a central pillar of DevOps culture. I love the idea that you can identify a problem, sit down around the table, throw out bad ideas, nobody criticizes anybody else's ideas, and keep talking until you arrive at, if not the truth, at least a version of consensual reality that's closer to everyone's shared experience than what we started with. I think in some ways it's really exciting to be a privileged white person at this point in time, because we have access to books and podcasts and films made by people whose experience is incredibly different from ours.
Rachel Chalmers: For those of us in the tech industry who got here because we love learning and we want to keep learning all of our lives, there is so much material that we just could not have access to 50 years ago, that we can immerse ourselves in and have the experience of living other people's lives and knowing what it's like to live inside another skin. That's an incredible gift. And I think the work of sitting with our discomfort and understanding the limitations of the white supremacist systems that we've built is balanced by this opportunity to change and to grow as human beings.
Sam Ramji: I'm curious to see how that lands in ML, right? One of the challenges in ML is the bespoke nature and the lack of repeatability that you have in ML systems. There's so much coming out of the scientific computing culture's common sense, where there's not a lot of reproducibility. There's a lot of copying of data sets, there's a little bit of sharing of code, models aren't really shared, and that looks a lot like how we used to build software many, many years ago. I wasn't writing software in the seventies. I was alive, but not writing software. But from people I've talked with about the seventies and early eighties, where we are in ML feels like it's back then. Right? Sort of pre-industrial, not large-scale, production-oriented thinking. But DevOps is this sort of vision from the future, right? It's 30 years of maturity in software development. What of that do you think can be brought faster than 30 years into machine learning?
Rachel Chalmers: DevOps is certainly the future that we'd like to have. I think it's here, but not evenly distributed yet. And I think the world that you're describing in machine learning comes from two places. One is the fact that when stuff is earlier, we haven't really figured out great best practices yet. Everything is handmade. It's like doing a startup. The first 10 sales are founder's sales, you've just got to grind. You've got to get on the phone, you've got to hit LinkedIn. You've got to do the work. ML is still at the point where we're building the building blocks. It's very crude and raw. The other less reassuring thing that you're seeing happening is this constant desire that comes from a place of scarcity to monopolize. We talk about limited data sets. We talk about black box systems. We don't really understand how they work.
Rachel Chalmers: There's a real resistance to sharing the implementation details or the data sets because the way our venture industry is structured, everybody wants to have a moat. Everybody wants to have a barrier to entry. Everybody wants to have a competitive advantage. And I think that's incredibly destructive to progress. I think one of the things I really enjoy about the open source end of the industry is how generous everybody is in sharing ideas and methodologies and approaches. And of course, more traditional venture investors dislike it for exactly the same reason. It's much harder to build a barrier to entry in an open source community, as you well know. So there's always that push and pull with humans, what's mine and what's shared. I'm very clearly temperamentally on the sharing side of the line.
Sam Ramji: Well, we know from what happened with the telecommunications companies in the 2000s, the emergence of the iPhone, and what the 2010s looked like, that walled gardens tend to collapse. That's part of the inspiration behind this whole podcast and calling it Open||Source||Data: the sense that data is so powerful, so valuable, so important, and so little understood in how we can manage, operate, and share it, that the inspiration of what we see in open source and in DevOps, we think it's coming for data. So I'm really curious to understand: what's your vision? As we look at these different trends and themes of machine learning, of development, of DevOps, what's your vision of an ideal future for developers in enterprises, or developers in startups?
Rachel Chalmers: Okay, I'm going to get into trouble here if I haven't already. So a corporation is a group of people who have joined together to indulge in the fiction of legal personhood in order to hedge risk. And there's a bunch of really good reasons to do that: insurance and shared exposure and limited liability, but when it falls prey to the scarcity mindset, you end up creating billionaires and janitors who are working for minimum wage. So my personal vision of an ideal future for everyone in enterprises is more equitable distribution of the resources that are created from shared risk taking and shared labor. I'd also really like to retire the word meritocracy. I genuinely don't believe that there are smart people and dumb people. I think there are people in an environment which plays to their strengths and people who are dealing with an enormous external cognitive load, which may be invisible, but which they're probably dealing with incredibly ingeniously and in ways that we don't even understand.
Rachel Chalmers: The myth of meritocracy is used to uphold really insidious kinds of inequality in our industry, which contribute to the damaging consequences of the mistakes that we make in our code.
Sam Ramji: It's interesting to look at some of the things that we're seeing in AI tooling, as ways to flatten those barriers, right? And I start to think about AI more as augmented intelligence than as artificial intelligence, right?
Rachel Chalmers: Yeah.
Sam Ramji: I have a close friend who's built a company that focuses on natural language processing, not to interpret your intent, but to write an SQL query, or a smart QL query so that you can take native intelligence that perfectly smart people have and marry that with the ability to produce clean code, which has not been something that they've been trained in or represent their advantages. And similarly, I've run across a trend here of auto ML or auto AI, which will let a developer or a business analyst, somebody who's not a trained data scientist, and doesn't have all those capabilities, look at data and say, I think there might be a feature here. Let me kind of establish a few boundaries, some dimensional parameters, and then let me run the auto ML on it. And yes, it'll be rough and ready. It'll be cheap and cheerful, but it may give me a sense of whether that feature is valuable. And then I've become smarter, I've prototyped that capability, and then I may hand that off to this highly trained data scientist.
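[Editor's note: the "rough and ready" triage Sam describes, checking whether a candidate feature looks valuable before handing it to a data scientist, can be approximated with something as simple as a correlation score. This is a hypothetical sketch with made-up numbers; real auto-ML tools fit and compare whole model pipelines rather than computing a single statistic.]

```python
def pearson(xs, ys):
    """Plain-Python Pearson correlation: a cheap first signal of whether
    a candidate feature moves with the target at all."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical question: does "sessions per week" relate to "renewed"?
feature = [1, 2, 3, 4, 5, 6]     # sessions per week per customer
target  = [0, 0, 0, 1, 1, 1]     # did the customer renew?

score = pearson(feature, target)
promising = abs(score) > 0.5     # cheap-and-cheerful threshold, not rigor
```

If `promising` comes back true, the analyst has learned something and can hand the hypothesis to a trained data scientist for proper modeling, which is exactly the division of labor Sam sketches.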
Sam Ramji: So this ability to start to soften those barriers, by putting more of the programming logic and inference into software, into augmented intelligence tooling that can help people, I think starts to move away from the meritocracy mindset and more towards what you've described as DevOps: this sort of mitigated risk through process, trust, and care for each other as human beings to figure out how we innovate together.
Rachel Chalmers: I think that's exactly right. And the most concrete example of that kind of augmented intelligence that everybody has probably used is Google Translate. Think of how much more information you have access to with an incredibly powerful translation tool at your fingertips. This was one of the real joyous surprises for me at Autodesk: the internationalization and localization function that Autodesk has built in-house. Jennifer Johnson, who built it, is just a rockstar. The idea that one of the big obstacles this huge company had to overcome was just that builders in different countries tend not to speak English, and so it made sense to build a big internationalization software business within Autodesk, was really exciting and cool, and I loved working on that.
Sam Ramji: They use some pretty cool AI techniques to be able to make that mathematically scalable as well, to speak all the different languages and use the different jargon.
Rachel Chalmers: Exactly. Before we leave this topic, though, I did want to give a big shout-out to the people in the industry who've been doing that from the other side all along: the technical writers. They tend to get overlooked because this is another pink-collar ghetto, where a lot of women and a lot of folks with humanities backgrounds end up, but the importance of documentation and onboarding as force multipliers for people who are new and coming into a project cannot be overstated. The more work and effort and care is put into those very first tutorials, those very first language exercises in any open source community, the stronger the community. You can see that playing out in real time.
Sam Ramji: It's so great that you brought that up. We had a wonderful conversation with Patricia Boswell, who's a staff technical writer at Google, and we kind of have a perspective that next generation open source and maybe the definition of cloud-native implicitly includes great documentation because as we've gone from scarcity in software to abundance in software, what becomes scarce is human attention. And so if you're going to differentiate yourself and break out in adoption, you're going to have awesome documentation that takes the perspective of the user in mind, understands where they are in the journey, and kind of steps them up the stairway of interest and competence so that they can care enough about your software to use it well. I could not agree more with your statement.
Sam Ramji: What's one thing you wish you knew before you started your career? If you can remember donkey's years ago, or maybe to advise a version of yourself, who's maybe, Rachel in her twenties, what's the one thing that you might advise that person?
Rachel Chalmers: So this is kind of a hard question to answer, because I've made a ton of mistakes and taken a lot of wrong turns, and I kind of don't regret any of them, because I've ended up in this really interesting place with this really awesome job and still tons more to learn. The traditional answer to that question is, I wish I'd learned more math, I wish I'd studied more organizational psychology. But the great thing about being me right now, even in the middle of this pandemic, maybe especially in the middle of this pandemic, is that I can learn math. And I am learning organizational psychology, and I'm reading The Fifth Risk, and I'm looking at the 18F Methods page and reading the Rust Code of Conduct, and I feel like a monk in a library in Florence, with all of these incredible texts from the library of Alexandria at my disposal.
Rachel Chalmers: So, if I could talk to that person then, she was very young and anxious and worried whether she would find her place in the world. I would say it's going to be so much more challenging and so much more daunting and so much more exciting and rewarding than you can possibly imagine. Just keep following your nose.
Sam Ramji: I think that's helpful because sometimes the life is going to be the life, but being able to suffer a little bit less with anxiety would be a huge gift. So what one code resource, just sort of last question here, if you had to pick one code resource to point somebody who's a developer today, working on the kinds of problems that we're talking about, what would you point them to?
Rachel Chalmers: So it's not exactly a code resource. I cheated a little bit here. It's 18F Methods, which I just shouted out to. 18F is the GSA division that Obama built after the healthcare.gov fiasco on Oracle infrastructure. 18F was designed to do what I'm doing now, which is to codify the best new thinking about how to build products in Silicon Valley and ship that back to the mothership. And their methods page is huge, it's beautifully written. It embodies the exact customer centricity that it's teaching. It explicates everything that Michael Lewis is talking about in The Fifth Risk about government being the ideal collective exercise in hedging risk. It's just an incredibly rich resource. It's very, very granular with methods for doing customer interviews and checking in with features and product design. I can't get over how good it is. It's really a wonderful thing for us to have at our disposal.
Sam Ramji: That's awesome. They've been leading for such a long time. I remember getting a chance to work with a few of those folks when I was running Cloud Foundry. Just to give you a sense of how awesome they were, back in 2015, over the Christmas holiday, one of the developers at 18F, Diego Lapiduz as I recall, realized that they needed a more structured way to put applications in production. And he built out a Cloud Foundry infrastructure and contacted us about it a few weeks later and said, "This is working really well." We were like, "You are absolutely not what we expected from the government." And that was my first realization that incredibly thoughtful, capable people were putting their best effort into 18F. I'm really thrilled to hear from you that it's going stronger even today than it was back then.
Rachel Chalmers: You know, it's like working with the Star Trek Federation.
Sam Ramji: That's awesome.
Sam Ramji: So, Rachel, thank you so much for the conversation. Thank you for your generosity, your time and spirit. I really appreciate it.
Rachel Chalmers: It's been wonderful to catch up, Sam.
Kathryn Erickson: Hi, this is Kathryn Erickson. I work on the strategy team here at DataStax and I'm here with Alice Lottini, one of our principal architects. We both really enjoyed Rachel's episode with Sam.
Kathryn Erickson: There are a few things that stood out to me. Rachel said at one point that she's been lucky with good friends, and that luck turned into opportunities. It stood out to me throughout the podcast how confidently she spoke about her interests and how her interests have guided the opportunities that she's taken. How she spoke to the one person at the conference, and it turned into a four-hour conversation, which turned into somebody becoming kind of an integral figure in her career. You try not to think of missed opportunities, and it's different to think of every interaction as kind of an opportunity. I thought that was an optimistic outlook for every interaction that we have.
Alice Lottini: Yeah, absolutely. And I think that the fact of keeping an open mind, she said that she would tell her younger self to keep following her nose. If we have a set idea of where our career is, and what box we fall in, of course you will have an idea of your direction and aspiration to get to a certain point. But I think that leaving these possibilities open and letting yourself be interested in maybe taking a different direction because something comes up that you would never have thought about is very important. I think it can give people unexpected turns that then become maybe the best thing that they've ever tried. And if something doesn't work out or you don't like it, fail fast and realize you've learned that it's not your way. And so something else would be better.
Alice Lottini: And this kind of ties into the point that she made about meritocracy, which she sees as an outdated word that should not exist anymore. She was saying that there are only people who are in an environment that plays to their strengths versus people who are not. So in the end, it's really about finding your place, and finding your place can take some time and experimentation. But I think having an open mindset on that is really important, because when we start our careers, we don't really know how things will work out, and things change around us so much that even if you had an idea of where you were going, maybe something else has come up that is even better.
Kathryn Erickson: Yeah, especially through the pandemic. The way that she talked about not using the word "meritocracy" anymore: she said, "We don't know what each person is going through." I just thought that felt like a very humanizing way to look at all the different positions in a company, with each person finding the position that is right for them while still dealing with a lot of things that are going on in the world, and trying to be the best at what we are. The ability just to drop judgment and blame and kind of do your best was pretty profound. I love how she used the exact words that she meant in that comment, where she said to live inside another skin: that these experiences that we have, the literature, the environments that we put ourselves in, let us grow as people and have that experience.
Alice Lottini: Another point that I really found insightful was about the bias that's built into the data. It used to be built into the logic, and more recently it's implicitly built into the data. That was a really good point as well. We used to have logic-centric systems and now we're moving towards data-centric systems, and we still have a thread of bias built into what we do. In the past it was more explicit and somehow coded into what we wrote, whereas now, because we're informing systems with data, how are we choosing this data? What does this data represent? How are we training the algorithms that we create? This is really important to bear in mind when we think about building something that is truly for everyone.
Narrator: Thank you so much for tuning in to today's episode of the Open||Source||Data podcast, hosted by DataStax's Chief Strategy Officer, Sam Ramji. We're privileged and excited to feature many more guests who will share their perspectives on the future of software, so please stay tuned. If you haven't already done so, subscribe to this series to be notified when a new conversation is released, and feel free to drop us any questions or feedback at firstname.lastname@example.org.