Inside the "3 Billion People" National Public Data Breach (15 minute read)

National Public Data is a data aggregator that provides services based on the large volumes of personal information it holds. These services are used by investigators, background check websites, data resellers, mobile apps, and other applications. The company was breached, its records started leaking in April, and the data was publicly posted last week. This article looks at what the data contains: there are no email addresses in the social security number files, and at least some of the data is not associated correctly.

A California Bill to Regulate A.I. Causes Alarm in Silicon Valley (8 minute read)

The new bill, if signed into law, would require companies to test the safety of powerful AI technologies before releasing them to the public and allow California's attorney general to sue companies if their technologies cause serious harm.

Musk's new Grok upgrade allows X users to create largely uncensored AI images (4 minute read)

Grok does not appear to refuse prompts involving real people, and it does not add watermarks to its outputs.

Epic judge says he'll 'tear the barriers down' on Google's app store monopoly (2 minute read)

Epic has asked the court to force Google to let rival stores live inside Google Play.

Google's AI Search Gives Sites Dire Choice: Share Data or Die (12 minute read)

The tool Google uses to sift through web content to come up with AI answers is the same one that keeps track of web pages for search results, so sites that block Google's AI bot may not show up in search. Publishers must either offer up their content for use by AI models, which could make their sites obsolete, or disappear from Google search, a top source of traffic. Google has signaled to publishers that it is not interested in negotiating data-sharing deals, and media companies have little leverage in the situation.
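
To make the choice concrete, here is a minimal sketch in Python of how a publisher's robots.txt decision plays out, using the standard-library urllib.robotparser. The robots.txt snippets, publisher domain, and article URL are hypothetical; the sketch simply reflects the article's point that one crawler, Google's search bot (Googlebot), feeds both search indexing and AI answers, so there is no directive that blocks one use without the other.

```python
from urllib import robotparser

# Hypothetical robots.txt snippets for a publisher deciding whether to let
# Google crawl its pages. Per the article, the same Googlebot crawl feeds
# both search indexing and AI answers, so the choice is all-or-nothing.
ALLOW_EVERYTHING = """\
User-agent: Googlebot
Allow: /
"""

BLOCK_EVERYTHING = """\
User-agent: Googlebot
Disallow: /
"""

def googlebot_can_fetch(robots_txt: str, url: str) -> bool:
    """Return True if Googlebot is permitted to crawl `url` under `robots_txt`."""
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("Googlebot", url)

article_url = "https://publisher.example/some-article"  # hypothetical URL

# Allowing the crawl keeps the site in search results, but also hands the
# content to the system that generates AI answers.
print(googlebot_can_fetch(ALLOW_EVERYTHING, article_url))   # True

# Blocking the crawl keeps the content away from AI answers, but also drops
# the site from search, a top source of traffic.
print(googlebot_can_fetch(BLOCK_EVERYTHING, article_url))   # False
```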

Tesla is hiring people to do the robot (2 minute read)

Tesla is employing people to wear motion capture suits to help train its humanoid Optimus robot. The position pays up to $48 per hour and requires walking for more than seven hours a day while carrying up to 30 pounds and wearing a VR headset for extended periods. Employees must also be between 5'7" and 5'11" tall. Optimus may require millions of hours of data before it's fully ready to work in Tesla's factories.

Algorithms we develop software by (5 minute read)

Becoming a better engineer is about becoming a better pathfinder in problem space, and engineers get better at solving problems by writing more code. One way to do this is by rewriting code: a rewrite takes significantly less time than the initial implementation, and the results are usually much better. Breaking assumptions can help engineers approach problems differently and arrive at more efficient solutions.

OpenAI's Most Advanced Model Can Now Be Customized. Here's How (2 minute read)

OpenAI's GPT-4o and GPT-4o mini can now be fine-tuned and customized by developers for business use. Developers can use their own datasets to enhance the model's knowledge base with proprietary information and control how the model responds to specific questions. Fine-tuning costs $25 per 1 million training tokens for GPT-4o and $3 per 1 million tokens for GPT-4o mini. One million tokens is roughly equivalent to 2,500 pages in a standard-size book.
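
As a rough illustration, here is a minimal sketch of the workflow using the OpenAI Python SDK's file-upload and fine-tuning endpoints. The dataset file name, its contents, and the specific fine-tunable model snapshot ("gpt-4o-mini-2024-07-18") are assumptions for the example, not details from the article.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# 1) Upload a JSONL file of chat-formatted training examples built from
#    proprietary data. Each line looks roughly like:
#    {"messages": [{"role": "user", "content": "..."},
#                  {"role": "assistant", "content": "..."}]}
training_file = client.files.create(
    file=open("support_examples.jsonl", "rb"),  # hypothetical dataset
    purpose="fine-tune",
)

# 2) Start a fine-tuning job against a fine-tunable snapshot.
#    The model name here is an assumed example, not taken from the article.
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-4o-mini-2024-07-18",
)
print(job.id, job.status)

# 3) Once the job finishes, the resulting model ID (job.fine_tuned_model)
#    can be passed to the regular chat completions API in place of the
#    base model name.
```

Since fine-tuning is billed per training token, a 1-million-token dataset (roughly 2,500 book pages) would cost about $25 to train on GPT-4o or $3 on GPT-4o mini at the rates above.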

North America added a whole Silicon Valley's worth of data center inventory this year. It's not enough (2 minute read)

The eight primary data center markets in North America added 515 megawatts (MW) of new supply in the first half of this year, while all of Silicon Valley has 459 MW of data center supply. Data center space under construction is at a record high, and the vast majority of it is already leased. Even this enormous amount of new capacity is still not enough to meet the growing demands of cloud computing and artificial intelligence providers, and that demand has raised national rental rates by 6.5% on average.