Although I'm still very busy with the move from San Diego to Palo Alto, I'm slowly but surely finding time here and there to code.
Right now I'm working on finishing the in-app purchase view for Starlight. I know, I know… not the most exciting part, but still necessary. Here's the latest screenshot, fresh off the press:
As for subscriptions: once the user reaches the free-tier limit, I'm planning to offer either yearly only (most likely) or both yearly and monthly plans. Price-wise I'm not sure yet, but I'm leaning towards $34.99 yearly and $4.99 monthly…?
If all goes well (and that’s saying a lot) I’ll ship the next major update in September, delivering all planned 1.0 features, improvements and lots of bug fixes.
The soft launch of Starlight is going great – thank you for all the positive feedback and improvement suggestions!
The app will remain free until I add the final touches based on feedback. If you run into any issues, please let me know via the “Contact us” option in the Help menu, and if you love the app – ratings are appreciated!
On another note, I'm in Barcelona for two weeks – one of them on vacation! If you're a fellow macOS or iOS developer & want to go for some “cañas” (what locals call small glasses of beer), do reach out.
With all the talk & articles about GPT, a few weeks ago I decided to spend a couple of days coding one from scratch.
The training set can't be the same size (for $$$ reasons), but the model can follow a very similar architecture to the one used by GPT-3. As long as we narrow the training data, we can get amazing results – in my case I used US legal corpus data, which is widely available online.
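The heart of that architecture is causal self-attention – each token can only attend to itself and earlier tokens. Here's a minimal pure-Python sketch of the idea (illustrative only; the function names and shapes are my own, not from GPT-3 or my actual training code):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def causal_attention(q, k, v):
    """Scaled dot-product attention with a causal mask.
    q, k, v: lists of per-token vectors, all the same length."""
    d = len(q[0])  # head dimension, used for the 1/sqrt(d) scaling
    out = []
    for i, qi in enumerate(q):
        # Causal mask: position i only scores against positions 0..i.
        scores = [sum(a * b for a, b in zip(qi, k[j])) / math.sqrt(d)
                  for j in range(i + 1)]
        w = softmax(scores)
        # Weighted sum of the visible value vectors.
        out.append([sum(w[j] * v[j][t] for j in range(i + 1))
                    for t in range(len(v[0]))])
    return out
```

In a real model this runs as batched tensor ops with learned projections and multiple heads, but the masking and weighting logic is the same.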
I found this experience very valuable for better understanding how these new models can improve & be used in the future. If you want to follow along, I recommend reading the paper GPT is based upon, and also the great implementation from Andrej.
If you have an interest in this area and know the basics of AI / Python, I definitely recommend the experience. Personally, I think the impact on society will be very significant – although probably not in the obvious ways we're imagining.
It's amazing to think all of this is based on math – we've come a long way…
Here's the code I wrote, for reference. I optimized it for M-series Macs to take advantage of their GPUs, so it trains a lot faster. Note that the faster your GPU, the faster it will train; training time also depends on the amount of training data. I used an M1 Max with 32 GPU cores and 64 GB of RAM, and it took about two hours to train.
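For the curious: the M-series optimization mostly comes down to picking PyTorch's `mps` backend when it's available. A hedged sketch (assuming PyTorch 1.12 or later, which introduced MPS support; the fallback chain is my own convention):

```python
# Pick the fastest available PyTorch device, preferring Apple's Metal
# Performance Shaders (MPS) backend on M-series Macs.
# Falls back to a plain "cpu" string if torch isn't installed.
try:
    import torch
    if torch.backends.mps.is_available():
        device = "mps"
    elif torch.cuda.is_available():
        device = "cuda"
    else:
        device = "cpu"
except ImportError:
    device = "cpu"
```

You'd then move the model and batches over with something like `model.to(device)` – everything else in the training loop stays the same.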
Even though they're in their infancy, ChatGPT, Copilot & OPT are already impressive tools. The impact on how we work will be significant and, although expertise will still be critical, people who leverage AI will have an edge over people who don't.
I've increasingly been using Bing Chat (which uses GPT-4 behind the scenes) to parse Apple documentation and help find solutions when I'm stuck… and it's been quite helpful.
I now treat Bing Chat the same way I would Stack Overflow or a web search. Here's an example of something it helped me with today:
Clearly not perfect, but it led me in the right direction & helped me identify what I needed:
There’s a lot of chat right now about ChatGPT. And, although the technology may be a bit overhyped, some of the use-cases it enables have the potential to change the way we work:
Here’s one, from the Verge:
“Intelligent recap is the big new addition to Microsoft Teams Premium […]. Intelligent recap uses OpenAI’s GPT-3.5 model to generate meeting notes and highlights even if you weren’t present in a meeting.”