Home 2023.07.07
Post
Cancel

2023.07.07

You don’t have to read it, but you just might learn something.

Leading Thought

Because it bears repeating: Go easy on yourself. You did the best you could. You made the best choices you could make with the information you had at the time. There isn’t any more to it. Go easy.


Prime

ChatGPT: A New Chapter in Misinformation & Security

This was a really interesting talk not only about ChatGPT, but Large Language Models (LLMs) and a more in-depth look at how generative AI works. There is also a great segment on the overall cost of running ChatGPT, as well as other AI systems. While you’ll need to register to get a passcode to watch, it’s definitely worth some time to watch.

Now, you may be aware of some of the controversy around AI-generated art, fiction, possibly screenplays, or other derivative work. You may have shrugged it off as a non-concern because, let’s face it, all art is derivative, right? Now let’s talk about code. Let’s assume that tools like GitHub CoPilot are trained on the repositories in GitHub, even the private repos. If the AI gives you a suggestion derived from someone else’s private or proprietary work, is this help, or is this stealing? All that artistic stuff starts to feel a little closer to home.

You may also be aware of the hidden costs of these LLMs, from power required, to Post Traumatic Stress induced on people tagging data (see Exclusive: OpenAI Used Kenyan Workers on Less Than $2 Per Hour to Make ChatGPT Less Toxic). But what does that really mean? One estimate is that a single chat with ChatGPT has a 1000x increase in cost over a standard Google search. Aggregated, it’s estimated that, in computing costs alone, ChatGPT 3.5 costs OpenAI around $700,000 per day handling 10 million users per day; for those able to use ChatGPT 4, the cost has a 10x increase over 3.5. The reality that they either aren’t already, or will soon be making money from you as the product, is pretty slim. As the old saying goes, “There’s no such thing as a free lunch”. Better, maybe, Caveat emptor.

Getting to some of the security, there were some interesting cases discussed. Prompt Injection is manipulating the system to return results or perform actions that should be prohibited – think making bombs or similar. Even more so, one of the speakers describes a case where a resume scanning system might be hacked with with a bar of white text or a really tiny font that only the AI could read. Top that with the ability to easily pump out thousands – or even millions – of fake news articles that corroborate each other, and the race between the hackers – both good and bad – is on.

…you can cause a huge amount of harm with automation, with any level right? Like I could just write a little program in BASIC that – like an if-then type of thing – ‘if you see a target, launch a nuclear strike and end the world’ basically. I mean, you know, I’m being kind of facetious and extreme, but the point is, don’t fall into the trap of thinking it’s the size of the model that makes – that correlates – necessarily with risk, right? Even a very simple model – someone just mentioned war games – even a very simple model can cause a tremendous amount of harm.

Exclusive: OpenAI Used Kenyan Workers on Less Than $2 Per Hour to Make ChatGPT Less Toxic

You may have had an opportunity to play with ChatGPT. Maybe asked some fun questions, or asked for something silly, like explain something as if it were written by Shakespeare. You may not even have realized that your answers were probably safe for general consumption – that is, the AI probably didn’t come back with something racist, or worse. It’s amazing tech. Or is it?

Imagine get paid slightly above minimum wage in the US to spend your day peering into the darkest corners humanity has to offer. To spend hours on end reading comments about abuse, violence, or worse; look at pictures of the same. Day, after day, after day, after day. And then you get to go home to a family, if you’re lucky, or maybe you’re forced to sit with that by yourself.

Similar to the way Facebook and other social media companies try to keep the most heinous content off the platform, OpenAI has to rely on real people to tag data for the AI to recognize it. Most of the time, these people live in places with little opportunity and jump at the chance to do a job that might be tedious at best in order to survive. They spend their days helping to filter out the worst humanity has to offer so that we are able to safely have clean coding explained in the style of the King James Bible, oftentimes without receiving nearly enough counseling and support (if any), to deal with the trauma inflicted upon them.

Barely more than minimum wage.

This is another one of those important reads for everyone to understand that behind the AI are real people doing evaluation of data, and behind them are legions of people creating content that is the worst humanity can devise (take note, there is a content warning at the beginning of the article). Read the article. Understand the cost that safe social media and AI come with. Then maybe consider if the tradeoff is worth it.

“We were told that they [Sama] didn’t want to expose their employees to such [dangerous] content again,” one Sama employee on the text-labeling projects said. “We replied that for us, it was a way to provide for our families.”

How design is governance

Interesting article here about how design governs the experience of the customer and is, in many cases, either not well-understood, or driven by the wrong motivation. How often do you go to an app or website and find that it is more a barrier to what you want to get done than a help? Now imagine you are the organization who has created a novel app that gives your target audience this experience. You may be sealing your own fate.

Using some good examples from the real world, the author lays out a compelling argument for why we need to be less focused on engagement and more on disappearing into the background. Think of the things in the world you find most useful, whether they are technology or simple everyday things. Chances are, those things require little thought to use or effort to pay attention to – a tea kettle is a great example: fill it with water, heat it, walk away; it let’s you know when it ready for you to interact with again.

If you are involved in building software you want people to use happily, this is pretty much a must-read article to whet your appetite. You may already have a good idea what this is about if you’ve read Thinking in Systems or The Design of Everyday Things, but still worth a read as a reminder about how to build better for our customers.

All this culminates into a consumer experience where little about it can be fundamentally changed. And it’s nigh impossible to seek redress with the app developer. When angered by a poorly designed app, customers are trapped in a space that reminds me of the title of a classic Harlan Ellison short story: “I Have No Mouth, and I Must Scream”.

We have left the cloud

Interesting post here from David Heinemeier Hansson (DHH) about making the decision to move 37Signals out of AWS and back onto their own hardware (though in a fully-managed data center). The projection is that they will realize massive savings over their current cloud budget of $3.2M per year.

Now, you’re likely thinking about all the manpower, benefits, and everything else that The Cloud promises to equalize or even save money on. I mean, that’s the dream, right? But as DHH points out, the complexity that comes with being in The Cloud, as well as the reality of volatility and predictability for most large, stable companies never equalizes and costs much more in reality. This doesn’t mean that there aren’t very valid use-cases for Cloud vendors, but the wholesale jump may not have been the right one for a lot of companies.

Give this article a read, especially if you hold the opinion that The Cloud is better for your company. Included are lots of links and additional information, including how being in containers actually helped make the move back on-prem easier. You may just find it worth doing an experiment to see if moving some or all of your applications back in-house makes more sense for your bottom line.

The main difference here is the lag time between needing new servers and seeing them online. It truly is incredible that you can spin up 100 powerful machines in the cloud in just a few minutes, but you also pay dearly for the privilege. And we just don’t have such an unpredictable business as to warrant this premium. Given how much money we’re saving owning our own hardware, we can afford to dramatically over-provision our server needs, and then when we need more, it still only takes a couple of weeks to show up.

Return to Top


Coming Soon

Beer City Code

(August 4 - 5, 2023 | Grand Rapids, MI)

Definitely check this dev conference out – lots of names you may recognize: Mike Eaton, Cassandra Faris, Chris DeMars, and Rocket Homes own Chris Woodruff!

Black Hat USA

(August 5-10, 2023 | Mandalay Bay / Las Vegas + Virtual)

Infosec your thing? Then check out this conference in Vegas. There are two day classes available, as well as briefings demos, and more.

Black Is Tech 2023

(In-Person (Atlanta, GA): August 9 – 11, 2023 | Virtual: August 7 – 9, 2023)

The Black Is Tech Conference is a platform that connects Black tech professionals, students and entrepreneurs and provides access to resources for growth and development for these groups.

Return to Top


Humble Bundles

The Salesforce CRM Certification Mega Book Bundle

New offering from Humble Bundle benefitting Covenant House – and, if you don’t know it’s there, there is an Adjust Donation button that will let you give more of the take to charity! For a minimum donation of $25 you get 20 titles, including:

  • Salesforce Data Architecture and Management
  • Salesforce for Beginners - Second Edition
  • MuleSoft for Salesforce Developers - First Edition
  • Hands-On Low-Code Application Development with Salesforce
  • Mastering Apex Programming
  • And more!

The Ultimate Guide to React.js Book Bundle

New offering from Humble Bundle benefitting Covenant House – and, if you don’t know it’s there, there is an Adjust Donation button that will let you give more of the take to charity! For a minimum donation of $25 you get 22 titles, including:

  • React and React Native
  • Full Stack FastAPI, React, and MongoDB
  • Mastering React Test-Driven Development
  • ASP.NET Core 5 and React
  • Full-Stack React, TypeScript, and Node
  • And more!

The Complete Python Mega Software Bundle

New offering from Humble Bundle benefitting Children’s Miracle Network Hospitals – and, if you don’t know it’s there, there is an Adjust Donation button that will let you give more of the take to charity! For a minimum donation of $25 you get 45 items, including:

  • Beginners Guide to Coding in Python (20 Hours)
  • Python and NumPy for Data Science: Mastering the Foundations
  • Python Mastery: Visualizing Data with PyPlot
  • From Excel to Python: Mastering Data Science Automation
  • Introduction to Data Analysis: Pandas and Python for Beginners
  • And more!

CyberSecurity: Zero to Hero Software Bundle

New offering from Humble Bundle benefitting World Wildlife Fund – and, if you don’t know it’s there, there is an Adjust Donation button that will let you give more of the take to charity! For a minimum donation of $25 you get 22 items, including:

  • The Beginners 2023 Cyber Security Awareness Training Course
  • Writing Secure Code in ASP.NET
  • The Complete Ethical Hacking Course
  • Software Security Testing
  • Pentesting Fundamentals for Beginners
  • And more!

Return to Top


DevOps

Nmap Command Examples For Linux Sys/Network Admins

Nice rundown of Nmap (Network Mapper) and how to use it’s features for different results. If you are interested in Infosec, this can be a handle tool for both red and blue teams.

While it’s noted in the post, it never hurts to repeat warnings like these: Port scanning may be illegal in some jurisdictions. So setup a lab as follows to run nmap examples for learning purposes

Return to Top


Engineering

fsharpConf 2023

I managed to miss this in the Coming Soon section but, fortunately, fsharpConf 2023 was recorded and is available on YouTube. If F# is something you are interested in, check it out here. If nothing else, fast forward to the talk by Mark Seemann – he’s always worth listening to.

Rebuilding a comment component with modern CSS

This is a really cool breakdown of building out a nested comment component. While CSS is still a skill I need to (re)build, the author here does a great job of explaining his thought process on how to achieve the desired effect piece-by-piece. So much so, that while I may not have the experience to do this on my own, I was able to follow along pretty well.

Return to Top


This post is licensed under CC BY 4.0 by the author.