
How to Use AI with the Mac Notes App

At this year’s WWDC24, Apple announced exciting new AI features that will be integrated into its upcoming operating systems across devices, including iPhones, MacBooks, and iMacs. However, these features won’t be available until late fall in the U.S., and EU users won’t see them in 2024 due to privacy and security requirements tied to the EU Digital Markets Act (DMA).

Read more: Apple may delay AI features in the EU because of its big tech law

From the preview, it’s clear that Apple’s AI approach aligns with what many of us have wanted: AI capabilities built directly into the operating system, eliminating the need for specific services or websites. Most of this AI technology will work offline, directly on your device.

While we’re still waiting for these features to be available, there’s a way you can get similar AI capabilities on your device right now, offline, and for free. Recently, I’ve been exploring offline language models and discovered many options – both open-source and closed-source – depending on your device and operating system. Linux offers the most options, but I found something special for macOS users.

Ollama

Ollama is a tool that has recently gained attention, especially after Meta’s Llama 3.1 release on July 23rd, which pushes open-source models closer to closed-source offerings like ChatGPT and Gemini. With Ollama you can run your own text generation model offline, on-device. After testing it, I was able to integrate Ollama with the Mac Notes app, allowing it to perform tasks like generating content, summarizing long texts, writing basic code, and rephrasing text, all offline.
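To give a sense of what this looks like once everything is set up (the installation steps follow below), a single request can be sent straight from Terminal; this is just an illustrative sketch, and the prompt text is made up:

# One-off prompt; assumes Ollama is installed and the llama3 model has been pulled (steps below)
ollama run llama3 "Rephrase this more formally: the meeting got pushed to next week"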

Requirements:

  • Mac: Apple Silicon (M1 or later)
  • OS: macOS 11 Big Sur or newer
  • RAM: At least 8GB, more is better
  • Storage: Over 20GB of free space recommended

How to install Ollama and connect it to your Notes app

  • Visit the Ollama website and download the macOS version. Make sure your Mac runs macOS 11 Big Sur or later.
  • After downloading, open the installer file and follow the instructions to install the command-line tools.
    Once installed, open Terminal and run the command below. This command downloads the Meta Llama 3 8B chat model, which is about 4.7 GB in size.
ollama pull llama3
  • To test if it works, run:
ollama run llama3
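The run command drops you into an interactive chat session in Terminal (type /bye to leave it). As an optional sanity check before moving on, you can also confirm the model downloaded correctly:

# List downloaded models; llama3 should appear at roughly 4.7 GB
ollama list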

Integrating with Mac Notes:

  • Install the NotesOllama helper app with Homebrew:
brew install --cask notesollama
  • Launch the NotesOllama app, then open your Notes app, type or paste your text, and press Command + A to select everything. This activates NotesOllama, which shows a wand icon at the edge of the Notes window. Click the icon to choose how you want the AI to assist you.

  • And just like that, you have AI integrated into your Notes app, with the entire setup running completely offline and for free.
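Under the hood, NotesOllama talks to the local Ollama server, which listens on port 11434 by default. If the wand icon doesn’t show up, a quick way to confirm the server is reachable is to query its local API directly; this is a rough sketch, and the prompt is just an example:

# Ask the local Ollama API for a short completion (default port 11434)
curl http://localhost:11434/api/generate -d '{"model": "llama3", "prompt": "Say hello in one short sentence", "stream": false}'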

If you’re using an M1 Mac, be aware that running models through Ollama can noticeably increase resource usage. The M1’s GPU handles these workloads well, but you may still see spikes in memory and CPU use, so make sure your system has adequate cooling and enough free memory for smooth performance.
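If you want to see how much memory the model is actually holding while it runs, recent Ollama versions ship a command for that; this assumes a reasonably up-to-date install:

# Show currently loaded models and their memory footprint (recent Ollama versions)
ollama ps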

If you use Windows or Linux, you can achieve similar results with the Obsidian-Ollama plugin.

Tags: AI for Mac, AI in Notes, AI Integration, AI Setup Guide, Free AI Solutions, Language Models, Mac Notes, Mac Productivity, macOS Tips, Notes App Enhancement, Offline AI Tools, Ollama, Ollama Installation, Productivity Hacks, Tech Tutorials