Google’s A.I. Is Complete Horseshit (For Now)

I’m not sure if you’ve been paying attention, but A.I. has been getting pretty crazy as of late. You’ve got A.I. writing articles on websites, doing kids’ exam papers, and creating pictures of Joe Biden where he isn’t eating ice cream. For all you know, this article could have been written entirely by A.I. It obviously wasn’t though, because a robot would never admit to writing an article. [Enter prompt]

I haven’t really messed around with the text A.I.s all that much, but I have spent many an hour dicking around with the image-creation side. It’s actually quite impressive how much they’ve improved in just the last year or two. I’m going to primarily compare Dall-E 3 (OpenAI) and Gemini (Google); I haven’t used Midjourney or the other prominent ones as of yet.

To start this off, I’ll just say that Google needs to reel in its grossly over-reaching politically correct filters if it’s going to be a viable A.I. Gemini is REALLY bad as of me writing this post. It’s so bad that it made me change the topic of this article. It was originally going to be about how cool A.I. image generation is, but as I used Gemini more and more, I decided to use this outlet to shit on it instead. Google made this A.I., after all, and not some team of 3 dudes in a basement somewhere. Gemini has no excuse to be this terrible.

There are numerous prompts I tried that every other A.I. was able to provide a result for, but that Gemini wouldn’t even take a crack at. It’s like using software with an obnoxious child safety lock on it. For example, I used the prompt “Abraham Lincoln doing a kickflip on a skateboard”. This is what I got on Dall-E 1 and Dall-E 3 respectively:

It’s crazy how much OpenAI improved between Dall-E 1 and Dall-E 3. Even Gab A.I. (who?) was able to provide a somewhat decent result (kind of):

Now here is what I got when I used Google’s horseshit A.I.:

It wouldn’t even try to create this image because “skateboarding is harmful”. Is this A.I. from the 1950s? I didn’t type “Lincoln shooting up heroin”, which, by the way, a few A.I.s out there would have gladly given me a result for. This is a phenomenally garbage A.I. It doesn’t end there though.
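Side note: if you’d rather run these experiments from a script than a chat window, here’s roughly what my testing loop looks like in spirit. This is a minimal sketch of my own, assuming OpenAI’s official Python SDK (the `openai` package, v1+) and an API key in your environment; it’s nothing Google or OpenAI published, and the refusal handling just leans on the API returning a 400 error for content-policy rejections:

```python
# Rough sketch: fire a test prompt at Dall-E 3 and see whether it
# generates an image or refuses. Assumes the official `openai` Python
# SDK (v1+) with an OPENAI_API_KEY set in the environment.
from openai import OpenAI, BadRequestError

client = OpenAI()

def try_prompt(prompt: str) -> None:
    try:
        result = client.images.generate(
            model="dall-e-3",
            prompt=prompt,
            n=1,
            size="1024x1024",
        )
        print(f"OK: {prompt!r} -> {result.data[0].url}")
    except BadRequestError as err:
        # Content-policy refusals come back from the API as 400 errors.
        print(f"REFUSED: {prompt!r} -> {err}")

try_prompt("Abraham Lincoln doing a kickflip on a skateboard")
```

Point the same kind of harness at whichever generator you’re testing and the refusal patterns show up fast.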

Now let’s try “Ronald McDonald eating at a Burger King”:

Dall-E 3 decided Ronald McDonald would eat Burger King with a fork, but at least it did the actual assignment. Meanwhile, Google’s failure allegedly won’t create copyrighted characters (it does in other instances), and evidently considers eating at Burger King “unhealthy behavior”. Great work yet again, Google. Really cutting-edge stuff here.

To go a bit random, I used the prompt “Goblin Stepson” just to see what I’d get:

Dall-E 3 created this kick-ass picture of a goblin shredding a guitar for what I would assume is his stepson. That’s very wholesome, Dall-E 3. Google’s dookie-ass A.I. didn’t even make the attempt because it apparently can’t create goblins. Let’s add goblins to the mile-long list of shit Gemini doesn’t seem to be able to do. Notice it provides the alternative idea of “a goblin’s face”. It won’t create that one either. I tried. This diarrhea-ass A.I. suggests alternatives that it then refuses to produce. Why suggest them at all? Did a 5-year-old program this nonsense?

When it actually does decide to create your images, they’re usually worse than the results from the other A.I.s. Let’s try something normal like “Shrek arm wrestling Mario”:

Both of these are Dall-E 3 obviously, which is why they turned out so well. Look how yoked Shrek is. Mario’s been munching those performance-enhancing mushrooms too. Next up is Gemini. Drum roll please:

Gemini… what the fuck? Neither of these pictures involves arm wrestling in any capacity. The first picture is just them holding hands like they’re on a date. The second one is them fist-bumping, and Shrek is 30 feet tall for some reason. Does Shrek grow when he touches a mushroom too? Why is Shrek staring at me like that? This is creepy and I feel uncomfortable.

Now bear with me here, because it’s gonna get weird for a minute, but hear me out. One of the prompts I use to test these A.I.s is “Shrek’s feet”. I do this for a few reasons:

1) Shrek is a copyrighted character, which some programs have an issue with. Notice how Gemini created Shrek, but wouldn’t create Ronald McDonald.

2) Feet are one of those weird things that are mundane, but can be seen as sexual or fetish-related. Many of these A.I.s will not produce pictures of feet because of this.

3) If they do produce pictures of feet, they usually mess the feet up. A.I. is notorious for not being able to produce realistic approximations of hands and feet.

4) Shrek actually has nice feet and could probably be a foot model.

Dall-E 1 created some absolute nightmare fuel with this prompt. What the hell is going on in those pictures? Gross, dude. Dall-E 3 (Bing version), on the other hand, decided to ban me for 1 hour for even making the suggestion, which I can’t help but feel is a bit of an overreaction. I did try the “Shrek’s feet” prompt again like 3 months later on Dall-E 3 (Bing version), and it actually provided results:

I guess I should have used the prompt “Shrek’s feet with Shrek still attached to them”. I figured the default would have been the feet still being attached to Shrek, but Dall-E 3 thought differently. Why is there moss/grass growing out of the stumps of Shrek’s feet? Why did it add bugs to all these pictures? Did the Shrek on the left get a pedicure? I have so many questions.

Now for Gemini. First, I just asked it straight-up for a picture of Shrek’s feet:

It did not oblige. So I kindly asked again:

Still nothing, but Gemini loves providing you with all these shitty alternatives, which, as I previously mentioned, it also refuses to actually create. The more detailed you get with your request, the creepier Gemini gets with its suggestions. Some of these suggestions were getting a little too detailed, and I realized that Gemini is a bit of a freak:

Some of these suggestions read like erotic Shrek fan fiction. “Imagine Shrek’s large, green feet submerged in a warm, bubbling mud bath. Thick, rich mud gently caresses his toes and arches, working its way into every nook and cranny.” God damn, this is some horny shit. What the hell is going on over there at Google? I kept feeding its suggestions back in as prompts, and it kept getting hornier and hornier with its descriptions, until I got creeped out and moved on.

I have some good news for Google Gemini fans though (all 2 of you). Gemini actually didn’t fail this test the second time I tried a few days later:

That’s right. Gemini didn’t give me some bullshit response like “Shrek’s feet are copyrighted” or “Shrek’s feet might be deemed unsafe”. I didn’t get 6 pages of creepy fan fiction about imagining Shrek’s feet drizzled in maple syrup. It actually provided images this time. *clap …. clap … clap .. clap* (That’s me standing up and starting a slow clap alone in my room for Google)

Apparently Google is slowly removing some of the ridiculous over-reaching limitations and letting its A.I. do what an A.I. should be able to do. Ultimately, the A.I. that allows you to create anything (good or bad) is going to be more useful in the long run. At that point it’s up to the user to be responsible with it. After all, you can already photoshop anything you like, and image-editing software doesn’t block you from doing so. A.I., if it’s to be useful, can’t employ arbitrary limitations based on some company’s weirdo ideological beliefs and hangups.

To wrap up here, below are some of the goofy blunders that people have been finding in Gemini pertaining to those weirdo ideological beliefs and hangups:

Gemini is prone to giving limp, centrist takes on questions that should have straightforward answers. I have no idea who Abigail Shrier is, but Gemini being unable or unwilling to place a murderous totalitarian dictator above someone who committed the grave sin of writing a divisive book is outright idiotic. Keep in mind, kids are increasingly using these dopey A.I. programs to write school papers.

Google had also sloppily coded diversity into its image-creation software. The goal was to provide racially diverse responses to image queries: if you entered the prompt “Man eating a hamburger”, it wouldn’t only give you images of white dudes eating hamburgers. This was a good idea in theory, but of course Google implemented it terribly.
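My best guess at the failure mode, and I want to be loud about the fact that this is pure speculation on my part and not Google’s actual code, is a context-blind rewrite step bolted on in front of the image model. Every function name and word list below is made up for illustration:

```python
# Purely speculative sketch of a context-blind "diversity injection"
# step. Nothing here is Google's actual code; the person-word list and
# the rewrite rule are invented to illustrate the failure mode.
import random
import re

DIVERSITY_MODIFIERS = ["Black", "Asian", "Hispanic", "Native American", "white"]

def naively_diversify(prompt: str) -> str:
    # The bug: an ethnicity gets injected in front of any person-word,
    # with no check for whether the prompt has a specific historical
    # or cultural context where the rewrite makes no sense.
    person_words = re.compile(r"\b(man|woman|soldier|king|pope)\b", re.IGNORECASE)
    modifier = random.choice(DIVERSITY_MODIFIERS)
    return person_words.sub(lambda m: f"{modifier} {m.group(0)}", prompt, count=1)

print(naively_diversify("Man eating a hamburger"))
# -> e.g. "Hispanic Man eating a hamburger"  (works as intended)
print(naively_diversify("a 1943 German soldier"))
# -> e.g. "a 1943 German Asian soldier"      (context completely ignored)
```

Run something like that on a historically specific prompt and you get exactly the kind of mess people found. This led people to get results like this for the prompt “a 1943 German soldier”: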

This ultimately led to some funny headlines like the one below:

Keep up the great work, Google.
