Harnessing the Potential of Machine Learning for Writing Words for My Blog


It is probably obvious that I asked ChatGPT to assist me in writing the title for this blog. I am not terribly pleased with it. This title is about halfway between what I originally wanted and what ChatGPT suggested. Since I am writing about using artificial intelligence to assist me with my work here, it seems like I should let it help out from the very beginning of the task!

Stable Diffusion Coffee Guy

If you are looking for an article to tell you exactly how to leverage things like large language models to make your work easier, then this probably won’t be the blog post for you. I am only starting down this road. It has been bumpy, and I don’t know if I am doing a good job.

I have already had some success, so I figure this is a good time to talk about what I have done so far, and to tell you where I hope this journey takes me!

You need to be using Stable Diffusion

I have been using Stable Diffusion to generate images akin to stock photos for my blog for most of 2023. I don’t know that I can say that it has been a game changer, but it has been a huge help!

Blog posts look better when there are images to break up the wall of words, and nice photos make things feel more inviting. The images help to anchor your eyes. When you decide to scroll back to reread something important, you are more likely to find the words quickly if you remember them being near a particular photo.

Stable Diffusion NVMe Guy

On occasion, I find that I just haven’t taken enough photos. There are also a lot of times when I am writing about a topic that doesn’t actually exist in the real world, so there is no tangible subject matter available to take a photo of. It is amazing to be able to give Stable Diffusion a prompt like nefarious hacker stealing a laptop. I can ask for 400 images, and they’ll be ready by the time I finish making my latte. I can flip through A.I. images for five minutes to find one to spruce up a blog about running Octoprint through a networked serial port.

Initially, I was using Stable Diffusion on other people’s computers via the Stable Horde open-source project to generate images. Images took quite a while to generate, and I could only ask for a few at a time. I have since followed Automatic1111’s guide to get Stable Diffusion running locally, so now I can use my Radeon 6700 XT to easily generate 400 images in less than ten minutes.
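
I do everything through the web UI, but if you are curious what is happening under the hood, here is a rough Python sketch using the Hugging Face diffusers library instead. This is not my actual setup; the library, the model name, and the batch sizes here are just assumptions for illustration.

```python
# A minimal sketch of batch image generation -- NOT my actual workflow.
# I use the AUTOMATIC1111 web UI; this assumes the diffusers library instead.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
)
pipe = pipe.to("cuda")  # ROCm builds of PyTorch also expose the GPU as "cuda"

prompt = "nefarious hacker stealing a laptop"
for batch in range(100):
    # 100 batches of 4 images gives me 400 candidates to flip through later
    images = pipe(prompt, num_images_per_prompt=4).images
    for i, image in enumerate(images):
        image.save(f"hacker-{batch:03d}-{i}.png")
```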

I also got some tips from watching our friend Novaspirit Tech’s videos. I have downloaded other models besides just Stable Diffusion 1.5, and I am able to generate cartoon images now. Those are a lot of fun!

I am not trying to fool anyone. I don’t work hard to generate images that look real or correct. I embrace the absurdity. Does the guy have three thumbs? Is there a misspelled approximation of a company logo on the NVMe the cartoon character is holding up? Then it is a fantastic image!

There are Stable Diffusion images sprinkled all over this post. I didn’t generate them to go along with the words. I am just picking some of my favorites that I didn’t wind up using in other blogs.

Dipping my toe in the ChatGPT waters

I wrote a new blog last week and I pasted every single paragraph into the free ChatGPT web interface. I tried telling it to rephrase this to make it more interesting, but that made for some extremely pretentious writing. I know I tried a few other adjectives besides interesting, but I wound up settling on just prefacing every paragraph with rephrase this.

I purposely overdid things. I did my best to replace at least one sentence in every paragraph with something straight out of ChatGPT. This was tough, because ChatGPT’s output doesn’t usually feel like my style.

Manually copying and pasting every paragraph into the ChatGPT web interface was tedious. I don’t time myself, but that definitely felt like the slowest blog I have ever written, and I was hoping that using ChatGPT might speed things up a bit!

What do you get if ChatGPT rephrases an entire paragraph?!

I should give you an example. This paragraph has been rephrased by ChatGPT 3.5 Turbo. There will be a copy of the original paragraph quoted below this one. Does this sound more pretentious than the rest of the blog? Would anyone actually want to read 2,000 words written in this style?

May I provide an example? Below is the rephrased version of this paragraph, generated by ChatGPT 3.5 Turbo. Do you think this excerpt sounds more pretentious compared to the rest of the blog? Would anyone have any interest in reading such a lengthy piece written in this manner?

When I write, I try to imagine that I am speaking to you. We might not be having a conversation, but maybe I am up in front of a classroom. I don’t want to sound like a textbook, and I definitely don’t want to sound like a fancy novel.

What if we ask ChatGPT to rephrase that paragraph in the style of Douglas Adams?

Allow me to present an illustration. See, this very passage has been tweaked by ChatGPT 3.5 Turbo. Right after it, you’ll find a replica of the original paragraph. Now, does this sound excessively pompous in contrast to the rest of the blog? Dare I say, would anyone possess the inclination to peruse a hefty 2,000-word composition crafted in this fashion?

That would never fit well on my blog, but it is awesome, isn’t it?!

Optimizing my use of ChatGPT a bit

I signed up for an OpenAI account, and I installed the chatgpt-arcana modules for Emacs. Then I wired up a key binding that would send any selected text to ChatGPT while automatically adding rephrase this to the prompt.

The chatgpt-arcana plugin isn’t quite what I want, and it is a little buggy for me. I can usually hit my key bind over and over with the results showing up in the same window. Every once in a while something changes, and it wants to split windows again, and I wind up having two ChatGPT windows side-by-side. When it works, it is way nicer than the context switch of manually pasting things into a web browser!
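
For what it is worth, the request itself is dead simple. Here is a rough Python equivalent of what my key binding is effectively sending. This is just a sketch using the pre-1.0 openai package with an API key in the OPENAI_API_KEY environment variable; it is not the actual chatgpt-arcana code, and the ask_chatgpt helper is a name I made up.

```python
# A sketch of the "rephrase this" request -- not the actual chatgpt-arcana code.
# Assumes the pre-1.0 openai package and OPENAI_API_KEY set in the environment.
import openai

def ask_chatgpt(prompt):
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

paragraph = "I have been using Stable Diffusion to generate images akin to stock photos for my blog."
print(ask_chatgpt("Rephrase this: " + paragraph))
```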

Isn’t ChatGPT supposed to make me work faster?!

So far, it does the opposite. I am spending so much more time shipping paragraphs over to ChatGPT, reading through the results, and cherry picking phrases or sentences to swap out.

Asking ChatGPT to take a blog title and give me back ten related titles might be saving me some time. Instead of massaging the words myself, ChatGPT will reorder the words, pick out some synonyms, and definitely give me options I wouldn’t have considered on my own. This is definitely faster than the usual stewing I do over a title.
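
Reusing the hypothetical ask_chatgpt helper from the sketch above, the title brainstorming is a one-liner:

```python
# Brainstorming titles with the same made-up helper from the earlier sketch
title = "Harnessing the Potential of Machine Learning for Writing Words for My Blog"
print(ask_chatgpt(f"Give me ten alternative titles for a blog post called: {title}"))
```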

I am confident that there are ways one can use ChatGPT to save time while writing, but I have yet to discover them.

I feel that ChatGPT is improving the quality of my writing

While writing that first blog with the aid of ChatGPT, I really wanted to see what it would suggest for each and every paragraph that I wrote. That proved to be quite time-consuming. Now, I only send paragraphs over when something just doesn’t feel right.

Stable Diffusion Podcast Hosts

Sometimes I am repetitive on purpose. Sometimes I write short sentences with the same simple structure. Sometimes it fits well. That felt really forced, but I imagine you get the picture.

Other times repetition is accidental. I might overuse a particular adjective or verb. I usually look for that on my own, but now I can send the paragraph to ChatGPT just to see what she does with it.

Is the ChatGPT API expensive? Should I be running a local LLM?!

I was curious about how much this would all cost. It turns out that chatgpt-3.5-turbo may as well be free. I went through a blog that was nearly complete, and I sent every paragraph through ChatGPT at least once. Between that 2,000-word post and all my various testing to tune in my Emacs plugin, I have accrued $0.03 in charges, and I can tell by the graph that they had to round up to get me to that third penny.
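
The back-of-the-envelope math checks out. Assuming roughly $0.002 per 1,000 tokens for gpt-3.5-turbo at the time, and something like 1.3 tokens per English word, one full pass over a 2,000-word post works out to about a penny:

```python
# Back-of-the-envelope cost estimate -- pricing and tokens-per-word are rough assumptions
words = 2000
tokens_in = words * 1.3           # the paragraphs I sent up
tokens_out = tokens_in            # assume the rephrased output is about as long
cost = (tokens_in + tokens_out) / 1000 * 0.002
print(f"${cost:.4f}")             # roughly a penny for one pass over the whole post
```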

They have yet to let me try GPT-4. I am excited about trying it out. It only costs about four times as much as GPT-3.5 Turbo. Four times very nearly free is still nearly free! I hear it is a good bit slower, and that might be a big disappointment. I am getting decent responses back today in just a couple of seconds. Would I be willing to wait longer? Would I use ChatGPT less often if it took two or three times as long to get a response?

Stable Diffusion 3D Printer Guy

I did some very basic research into running a large language model locally. The consensus seems to be that Llama 2 70b is more or less comparable to chatgpt-3.5-turbo. My understanding is that 70b is too big for a single 24 GB GPU.

That isn’t the end of the world; you can easily find a pair of older server-grade Nvidia GPUs for just over $400. That’s not too bad, right?!

Later, I found out that Llama 2 70B might only run at around 30 or 40 tokens per second on a pair of Nvidia 4090 cards. It isn’t going to run anywhere near that fast on a couple of $200 Tesla P40 cards from eBay, and 40 tokens per second is already molasses compared to using OpenAI’s API.
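
If I ever do give a local model a shot, it would probably look something like this minimal llama-cpp-python sketch. The model file, the quantization, and the settings here are assumptions, not something I have actually run, and a 4-bit 70B model still needs serious hardware or a lot of patience.

```python
# A minimal local-LLM sketch, assuming llama-cpp-python and a quantized Llama 2 70B file.
# I have not actually run this; the model path and settings are placeholders.
from llama_cpp import Llama

llm = Llama(
    model_path="llama-2-70b-chat.Q4_K_M.gguf",
    n_gpu_layers=-1,  # offload every layer the GPU(s) can hold
)
result = llm(
    "Rephrase this: Blog posts look better when there are images to break up the wall of words.",
    max_tokens=256,
)
print(result["choices"][0]["text"])
```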

Llama 2 and ChatGPT are moving targets. The stack for running Llama 2 locally keeps getting faster, but ChatGPT is also being optimized.

There are definitely privacy concerns when sending all your words up to a cloud service, but they aren’t relevant for my use case. Everything I send up to ChatGPT is going to be published on my blog a couple weeks later anyway.

ChatGPT vs. a human copy editor

For the last ten years, Brian Moses and I have been paying an actual human being to proofread everything that we write. Our fantastic editor has been worth every penny, and she will continue to be an important part of our writing processes for as long as she allows us to keep paying for her services!

Our editor has definitely improved my writing. I made a lot of the same exact grammar mistakes for years. Every time I read her corrections, I wind up thinking about them, and I am ever so slightly less likely to make the same mistakes next time. Once enough time goes by, I almost stop making those mistakes altogether. I know for certain that there are a lot fewer red marks on my posts today than there were five or ten years ago.

I can already see that ChatGPT is going to have a similar impact on my writing. ChatGPT will replace words and phrases with alternatives. Sometimes I hate the replacements, but every so often I like them quite a lot. When I like what ChatGPT tells me, it is going to make an impression on me, and I expect that I will start making small changes to my writing without even thinking about it.

A decade of experience tells me that everyone should have a human editor. I suspect having a robot assisting me for the next decade will be nice, but I definitely don’t want to live without the human editor. She is worth every penny, even though she costs more than ChatGPT!

What’s next?

I think this will be a fascinating question to learn the answer to! I am content, at least for the moment, to just do what I have been doing, but hopefully do it more efficiently. Asking ChatGPT to rephrase every paragraph I write is time-consuming. Learning when to ask will streamline things quite a lot.

I am aware that ChatGPT could do a better job if I gave it more context. Even gpt-3.5-turbo can hold 4,000 tokens. That is enough space to feed it an entire blog post when asking it to rephrase just one paragraph. If I have blogs that don’t fit, I could pay a bit more for a model with a higher token limit. This seems like an interesting next step, but I am not in a hurry to figure out how to get chatgpt-arcana to do that automatically for me. Sending an entire post for context will probably slow down my queries, but maybe I can set things up so I only take that hit on the first query of each session.
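
Here is a sketch of what that first big-context query might look like, again with the pre-1.0 openai package plus tiktoken for counting tokens. None of this is wired into chatgpt-arcana yet; the file name, the token cutoff, and the 16k fallback are just assumptions.

```python
# A sketch of sending the whole post as context -- not wired into my Emacs setup yet.
# Assumes the pre-1.0 openai package and tiktoken; file name and cutoff are placeholders.
import openai
import tiktoken

post = open("blog-post.md").read()
paragraph = "Sometimes I am repetitive on purpose."

enc = tiktoken.encoding_for_model("gpt-3.5-turbo")
tokens = len(enc.encode(post))
# gpt-3.5-turbo holds about 4,000 tokens; fall back to the 16k variant for longer posts
model = "gpt-3.5-turbo" if tokens < 3000 else "gpt-3.5-turbo-16k"

response = openai.ChatCompletion.create(
    model=model,
    messages=[
        {"role": "system", "content": "Here is the full blog post for context:\n" + post},
        {"role": "user", "content": "Rephrase this paragraph: " + paragraph},
    ],
)
print(response.choices[0].message.content)
```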

What do you think? Am I only just barely scratching the surface, or have I already dived deeper into Stable Diffusion and ChatGPT waters than I realize? Are you using machine learning to generate stock photos for your articles or using a large language model to help you with your writing? Should I be doing something completely different? Let me know in the comments, or stop by the Butter, What?! Discord server to chat with me about it!

chatgpt-arcana.el on GitHub
