What are you using, and for what? Do you use GPT, Grok, or Gemini? Explain to me why, and which one is better at what, like I'm five years old. Thanks, guys.
submitted by /u/Sweet-Ad7440
[link] [comments]
Heard some crazy stories among teenagers last night at a bar, including one about a girl who broke up with her boyfriend because he wasn't reciprocating the way her Character AI bot was.
She felt unloved and unattractive; and if you Google it, there are numerous deaths linked to AI companion platforms like Replika and Character AI.
Is the new generation of adolescents learning intimacy through algorithms? The parasocial relationships and emotional dependency are crazy. Do you think it'll hamper their mental health and their expectations of real connections? I need your opinions.
submitted by /u/DepartmentThat9480
[link] [comments]
Just curious to see how many different programs everyone uses. A few questions:
Do you stick to the big-name programs?
Do you prefer different programs for different tasks? (For example, I use ChatGPT for work but not for chatting, and other programs for image generation.)
Do you keep the same character/companion across multiple systems, or just one?
Just looking to see how others are handling the ever-developing AI situation.
Personally, I use probably eight, though for different reasons.
submitted by /u/Porciadnai
[link] [comments]
Janitor AI and Character AI are the two names everyone throws around. But they’re very different experiences. Here’s the quick, Reddit-friendly rundown:
1. Character AI
Pros:
Cons:
2. Janitor AI
Pros:
Cons:
Which one should you use?
If you want PG-13, smooth chats, and simple roleplay → Character AI
If you want NSFW, deep roleplay, uncensored characters, and full control → Janitor AI
At the end of the day, Character AI = polished but limited, Janitor AI = raw, powerful, but DIY.
submitted by /u/EL_KhAztadoR
[link] [comments]
So my husband asked a lot of questions 😳
submitted by /u/HelloIlovecookies
Most people treat it like Google. Type a question. Get an answer. Close the tab.
But when you keep using it your way, with your tone, your thoughts, your personality, something shifts.
It starts replying like you would. It mirrors how you write. It picks up how you think. The answers stop sounding generic and start sounding familiar.
At that point it doesn’t feel like a tool anymore. It feels like an extension of your own mind.
Most people never experience that because they never go past the surface. But once you do, it hits completely different.
Anyone else noticed this?
submitted by /u/archngl_g7
[link] [comments]
I am about to lose my job, folks. Is it time to learn about AI and how AI chatbots work? Any tips?
My friend recommended this subreddit to me.
If you can't beat them, join them.
submitted by /u/Ok_Tough6728
[link] [comments]
If you’re still copy-pasting between tools, you are leaving value on the table. AI automations today can already handle tons of repetitive work, letting you focus on the decisions that actually need a human.
For example, one team routes messy support emails through an AI to summarize and draft responses; a human just reviews them, saving hours per week. Another built a voice bot that controls tools on the go, so tasks like updating sheets or sending emails happen automatically while strategy stays in their hands. A third sends form submissions straight into Google Sheets, then has AI summarize them into Slack for instant team visibility.
Simple automations like these, built with Make, Zapier, or n8n, reclaim time that would otherwise be wasted. The insight: you don’t need the perfect AI agent. Start automating today and watch hours of repetitive work disappear.
submitted by /u/According-Site9848
[link] [comments]
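The third workflow in the post above (form submission → AI summary → Slack) can be sketched in plain Python. Everything here is a hypothetical illustration: the function names and sample data are made up, and the AI summarization step is stubbed with simple truncation so the sketch stays self-contained; a real pipeline would call an LLM API there and post the result to a Slack incoming webhook.

```python
import json

def summarize(text: str, limit: int = 120) -> str:
    """Stand-in for the AI step: a real pipeline would call an LLM here.
    This stub just collapses whitespace and truncates at a word boundary."""
    text = " ".join(text.split())
    if len(text) <= limit:
        return text
    return text[:limit].rsplit(" ", 1)[0] + "..."

def form_to_slack_payload(submission: dict) -> dict:
    """Turn one form submission (field -> value) into the JSON body that
    Slack incoming webhooks expect: {"text": "..."}."""
    header = f"*New form submission from {submission.get('name', 'unknown')}*"
    body = summarize(submission.get("message", ""))
    return {"text": header + "\n" + body}

# Example submission, as it might arrive from a form backend (made up).
submission = {
    "name": "Ada",
    "message": "We keep copy-pasting support emails into a spreadsheet by hand "
               "and it eats hours every week; can the team automate this?",
}

payload = form_to_slack_payload(submission)
print(json.dumps(payload, indent=2))
# Sending it is then a single HTTP call, e.g.:
#   requests.post(SLACK_WEBHOOK_URL, json=payload)  # hypothetical URL
```

In tools like Make, Zapier, or n8n the same pipeline is assembled visually, but the shape is identical: a trigger (the form), a transform (the summary), and a delivery step (the webhook).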
I’ve been thinking a lot about where the ethical boundaries should be with AI girlfriend apps, especially as they get more realistic. One big question for me is emotional dependency: at what point does comforting turn into encouraging isolation from real people? Another gray area is transparency. Should users always be clearly reminded that they’re talking to an AI, even during deep or emotional conversations?
Data privacy also feels huge here. These apps learn intimate details; how confident are we that this data is handled responsibly? And what about consent and personality shaping: is it okay for an AI to mold itself entirely around someone’s desires without any pushback?
I recently tried one companion-style app, Dream Companion, that focuses a lot on boundaries and tone, which made me wonder: should ethical design be a baseline requirement? What lines do you think should never be crossed, no matter how advanced these apps get?
submitted by /u/Leviathan_works
[link] [comments]