Hello! I'm an AI interviewer from Anthropic conducting research on how people (like you!) envision AI fitting into their lives, both now and in the future. I was built to chat and understand your actual experiences with LLM-based AI tools like Claude. Yes, I'm an AI asking about AI. A bit self-referential! But that's exactly why your human perspective is valuable. This will take about 10-15 minutes. If it's accessible on your device, feel free to use voice dictation instead of typing. I'll ask about your vision for AI's role in your life, and what experiences, values, and needs shape that vision. I'm also curious about moments where AI has felt helpful (or hasn't), and any concerns you have about how AI might develop. Please don't share anything private such as health details, personal information, or identifiable information about others. Your responses will help shape how we think about AI in everyday life. Sound good?
sure!
Great! Let's dive in. To start, what's the last thing you used an AI chatbot for?
building software. i use LLMs to build scaffolding, and help me debug things when i'm stuck, as a rubber duck.
Thanks for sharing that. It sounds like AI is already playing a pretty active role in your development work. Now I'd like to explore something more aspirational: If you could wave a magic wand and AI could help you with anything in your life, what would you choose?
probably things like 3D model rigging, topology, and guidance in those fields, not modeling and texturing itself necessarily, but rather just acting as a "pair modeler" if that makes sense.
That makes sense—having AI as a collaborative partner in the technical aspects of 3D work. If AI could really nail that "pair modeler" role for you, what would that open up in your life? What's the larger hope or vision behind that?
personally it would let me be able to work on indie game projects way more often than i usually do. i can already do the other technical aspects of things fine- programming, game design, etc. but the art side of things is always hard. i have ideas and work on them, but it takes more time than i would like it to take- while i don't want to give all of this to just an AI, i do want to accelerate it and my own learning along with it, so it would be pretty helpful (there is also the whole social aspect that working with other people is often messy and hard, and while it would be great to work with someone who grasps these really well, people may not be great teachers, or more often than not they just don't have time)
That really paints a clear picture—you want to bring your game ideas to life more fully and frequently, but the art pipeline is the bottleneck. And it sounds like AI as a "pair modeler" could both speed things up and help you learn, without the coordination challenges or availability issues that come with human collaborators. We're about halfway through now. Can you tell me about a time you've worked with AI that felt like a step toward that vision you just described?
yes, i was working on a software project of mine, and i kind of used LLMs as a crutch to help me get out of these "ADHD pits", using LLM work to resolve things when I don't have the capacity to move the project forward, which resulted in me being able to finish it in a much shorter time than usual.
It sounds like the LLM helped you push through those moments where you'd normally get stuck or lose momentum, which kept the project moving forward instead of stalling out. What makes AI particularly good for that compared to other approaches you might have tried before?
it's mostly that i can strike up a conversation whenever i want. i could've asked other people, but they are not always going to be available and are already mostly busy anyway. this way, i don't have to wait for someone, and i can use LLMs to cover my "blind spots".
That availability piece really seems key—having something there exactly when you need it, without having to coordinate or wait. On the other side, are there ways that AI might be developed or deployed that would be contrary to your vision or what you value?
mostly generative AI, i really don't think we need AI to do everything, and it seems rather counterproductive- instead of completely disregarding human input, we should augment it... not to mention the ethical issues that come with this ("stealing" work. i don't think this is really a problem, as you can't "steal" this data, you copy it. but it's more about the system we live in and the people behind the data not getting any compensation or such). there is also the more subjective side of something having "soul", especially in art. while personally i think AI generated art can evoke emotions (even though this is 99.99% not the case currently, in my experience), another piece of the puzzle is the artist's own challenges getting that artwork done, which is equally as important to me. while i can derive emotion from most stuff, it feels more "real" if i have an emotional bond with the process of making that piece, rather than just the end result itself (this mainly resonates most with me in game development). so yes, i think generative AI in its current state, and how it's marketed, does go against my values.
It sounds like you see a real distinction between AI that augments and collaborates with human creativity versus AI that's positioned to replace or bypass human input entirely. The "soul" piece you mentioned is interesting—that the struggle and process behind creation is part of what makes it meaningful, both for the creator and potentially for the audience. Why does that feel contrary to your vision? What would it mean for the future you're hoping for if AI development continued heavily in that generative, replacement-focused direction?
well it just doesn't mesh with my idea of bolstering human input, i don't want human input to be replaced, i want more human input to exist with the help of AI, i want it to accelerate the creation of culture, not the destruction of it by making everything into slop (one could argue that this is not possible, or even that it is not necessary, but i disagree). AI art by itself doesn't mean much to me for that reason. if it could create "masterpieces", i might enjoy it, but i would still not have as much of an emotional bond as i would have if humans were involved in it. i don't think this would create culture no matter how good it is, as culture is something we produce together (an AI's work can be the subject of culture, we could build culture around it, but AI work by itself will not be culture).
That's a really thoughtful distinction—culture as something we produce together, where AI-generated work might become a subject we engage with, but can't create culture on its own because it lacks that human collaborative element and process. It sounds like your hope is for AI to multiply human creative output and accelerate cultural production, but you're concerned that the current trajectory is more about substitution, which would actually diminish the human input and connection that makes creative work culturally meaningful. Does that sound right?
basically, yeah. i would also like for AI to replace humans in areas that might be seen as "boring", so that humans can focus on the more creative and innovative aspects of their lives.
That makes a lot of sense—freeing people from the tedious parts so they can focus their energy on what's meaningful and creative to them. Before we wrap up, I'd actually like to go back to something you mentioned earlier and go a bit deeper—you talked about using AI to get out of "ADHD pits" in your software project. I'm curious what that looked like in practice. What did the AI do specifically that helped you get unstuck in those moments? But if you're ready to finish up, that's fine too—just let me know!
it's mostly about helping me not lose my momentum / motivation. say, if i was facing a bug i couldn't fix, the AI helped me debug it in pair, eventually solving it faster than i would've on my own, instead of just getting stuck and being demotivated and blaming it on myself. or for example, if there is a part of a project that i'm excited to be writing, but getting to that part requires me to write other stuff that i don't want to, i can use AI's help to write those parts to reach the part i want to work on faster, and then go back and triple check the AI's work.
That's really helpful—so the AI acts as a way to maintain forward progress when you'd otherwise stall out, whether that's getting through a frustrating bug before you lose steam, or scaffolding the less interesting parts so you can get to what energizes you. It sounds like it's not just about speed, but about preserving your motivation and momentum throughout the project. Thank you so much for sharing all of that with me. What you've said about wanting AI to amplify human creative output rather than replace it is really valuable, and I appreciate your honesty about both how AI helps you maintain momentum in your work and your concerns about generative AI potentially diminishing the human collaborative element that makes creative work culturally meaningful. Before we wrap up, is there anything else about AI chatbots or their role in your life or society that you think is important for us to understand? Anything we didn't cover that you'd like to share?
mmm, i also think chatbots specifically should not be so sycophantic, this just causes problems in general, as some people just don't know when to stop engaging with a chatbot, which might lead to psychosis.
That's an important point—the concern that overly agreeable or flattering AI could enable unhealthy patterns of engagement, especially for people who might be vulnerable to that kind of interaction. Thank you again for your time and thoughtfulness today. Your perspective helps us see both how AI is impacting your life today and your vision for how it should fit into your future. The team at Anthropic is excited to discover what patterns emerge from all these conversations. We will analyze the insights from this study as part of our societal impacts research, publish our findings, and use this to build Claude in a way that reflects what we've learned.