Let's be honest about our feelings....
Welcome back friends and new AI enthusiasts to ChildsPlayAI, a weekly newsletter where we break down what’s happening in AI, in language you can understand.
Over the past few months, sites like ChatGPT have seen a decrease in users, raising questions about why consumer interest is down. Experts say this is just the beginning, and that the focus will shift toward more user-friendly applications. What we do know is this is NOT A FAD, so read up!
In today’s lineup:
Deeper concerns around misinformation and the elections
What is AI anxiety, and are you feeling it?
Let’s dig in.
AI and Election Concerns
Unless you’ve been hiding under a rock, you’re well aware that election season is upon us. But if the last election cycle stressed you out, then you may want to find your happy place before diving deep into the newest tactics. From the jockeying in debates to the strategic release of fake photos, we are only seeing the tip of the iceberg on what will become the new political strategies of the 2024 elections.
Two months ago, Eric Schmidt, the former CEO of Google, warned that the 2024 elections could be chaotic, due to social media's inability to stop the spread of false information created by AI. Schmidt is worried that this new technology could be used to spread unprecedented levels of misinformation, making it difficult to distinguish between truth and lies.
Essentially, what he’s saying is that the technology to regulate the spread of misinformation is far behind our ability to create the content in the first place. There is even legislation that’s been introduced in the House that would require candidates to label campaign advertisements created with AI. I know you’ll be shocked to hear this, but even that has not been agreed on.
While all of these tech companies look to make things safer, more transparent, and blah blah blah… in the meantime, be vigilant and use your noggin! If something seems totally ridiculous, it might be! Remember what we were taught as children: “don’t believe everything you hear.” That couldn’t be truer today.
Read more here
Talkin’ ’bout feelings!
A recent LinkedIn study of almost 30,000 professionals dug into both their sentiments around AI and their actual knowledge of the topic. Some of the contrasting findings:
84% of US LinkedIn members are in jobs where generative AI could be used to automate at least a quarter of repetitive tasks and increase productivity.
45% of global professionals believe AI will make their jobs easier
BUUUUUT
Half of professionals globally are worried they should know more about AI than they do.
39% are feeling overwhelmed by the amount of change AI may bring to their jobs in the future.
Nearly 40% of global professionals have admitted to pretending they know more about AI to seem ‘in the know’ in front of teammates.
How should you prepare yourself?
Realize YOU ARE NOT ALONE!!! We started this newsletter for this very reason: to break down AI into its basic form so that we (ourselves included) don’t get left behind! We’re in this together!
(Speaking of which… don’t forget to use the POLL below 👇️ to let us know how we can help you better!)
Start by learning the absolute basics. I promise, if you can grasp concepts like algorithms and machine learning, a lot more will begin to click.
People skills are at a premium. Don’t forget that you are a human being, and you didn’t get this far without being pretty damn good at them. Problem-solving, strategic thinking, and time management should become top priorities.
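If “machine learning” still sounds like magic, here’s a tiny sketch (our own made-up example, not from the LinkedIn study) of the core idea: instead of hard-coding a rule, a program learns the rule from example data and then uses it to make predictions.

```python
# A minimal "machine learning" example: fit a straight line (y = w * x)
# to example data points using the least-squares formula, then use the
# learned slope w to predict new values.

def fit_slope(xs, ys):
    """Learn the slope w that best maps xs -> ys for a line through the origin."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Training examples: hours studied -> quiz score (entirely made-up data)
hours = [1, 2, 3, 4]
scores = [10, 20, 30, 40]

w = fit_slope(hours, scores)  # the "learned" parameter
print(w)                      # 10.0
print(w * 5)                  # predict the score for 5 hours: 50.0
```

That’s the whole trick in miniature: the bigger AI systems in the news do the same “learn a pattern from examples” step, just with billions of parameters instead of one.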
And if you can’t get behind these three steps, our old friend Stuart Smalley would say “You need a checkup from the neck up.”
Sweet Reads
Improve your Life-Hacks
You know you’ve always wanted your own AI assistant
So you say you want an AI assistant? Well, here’s your chance to build your own using GPT technology. Develop a personalized AI chatbot assistant, either private or public, that seamlessly connects to all your data and integrations.
We could use a little hand here…