When AI first arrived in its present form, I was dead set against it because of the generative nature of what was being offered to the public. I considered any shortcut to creating art an offense to the craft.
But then I realized I could use AI for something that traditional searching was starting to fail at: research.
Also: Claude AI can do your research and handle your emails now – here's how
With both sides of my writing career (fiction and nonfiction), I have to do quite a bit of research, and Google was becoming a hindrance to that process. Instead of being fed useful information, I was inundated with ads, sponsored content, and its own AI-based answers (which were rarely helpful).
I first kicked the tires with Opera's Aria, which showed me that AI could actually be helpful. At the same time, I realized that AI also had to be supervised, because it could be wrong just as easily as it could be right.
I also discovered another helpful thing about AI: it could lead me down some fun rabbit holes, where I'd uncover something really cool to investigate. Eventually, that journey led me to two AI tools, both of which can be installed and used on Linux for free.
These two tools have helped me get more done every day.
1. Ollama/Msty
Ollama is an open-source AI tool. Its open-source nature is one of the main reasons I was drawn to it, because I know developers around the world can vet its code, and so far, no one has come forward to say they've discovered anything untoward in it.
On top of being open source, Ollama is simply easy to install and use. And the fact that you can download and use several different LLMs is a bit of delicious icing on an already sweet cake. I can use Cogito, Gemma 3, DeepSeek R1, Llama 3.3, Llama 3.2, Phi 4, QwQ, and many more.
Also: How I feed my files to a local AI for better, more relevant responses
But the main reason I prefer Ollama over any other AI tool is that it can be used locally, which means my queries aren't accessible to a third party. I like that level of privacy.
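That local-only workflow isn't limited to a chat window, either. Ollama serves a small REST API on the machine itself (port 11434 by default), so a script can query it without anything leaving the computer. Here's a minimal Python sketch of that idea; the ask_local_model() helper is just illustrative, and it assumes you've already pulled the model with "ollama pull llama3.2".

```python
# Minimal sketch: query a locally running Ollama model over its REST API.
# Assumes Ollama is installed, serving on its default port (11434), and that
# the "llama3.2" model has already been pulled. Nothing here leaves the machine.
import requests

def ask_local_model(prompt: str, model: str = "llama3.2") -> str:
    """Send a single prompt to the local Ollama server and return the reply."""
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(ask_local_model("Summarize the history of the typewriter in three sentences."))
```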
But how does Ollama help me get things done? First, there's the prompt library, which gives you access to several quick prompts and even lets you create custom ones. One prompt I often type is "Do a deep dive into the following topic and make sure to explore any relevant side topics:". Instead of always having to type that prompt out, I can create a quick prompt for it, so all I have to type is the subject matter. On top of that, I don't have to remember to prompt Ollama to explore related topics.
Also: How to run DeepSeek AI locally to protect your privacy – 2 easy ways
I create that quick prompt within the library so I can easily call on it whenever I need it. This saves me time and ensures I always get the prompt right every time. I don't have to think about what the prompt needs to say, and I can make it as simple or as complicated as I need.
Creating quick prompts in Msty is a sure-fire way to make your daily work a bit more efficient.
The prompt library is very helpful, especially for the more complex prompts I type regularly.
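If you prefer scripting, the same quick-prompt idea is easy to reproduce outside of Msty. This is just a toy sketch of the concept in Python, not Msty's prompt library itself: the prefix is saved once, and only the topic changes each time.

```python
# Toy sketch of a saved "quick prompt": the deep-dive prefix is stored once,
# and only the topic needs to be typed each time. This mimics the idea behind
# Msty's prompt library; it is not the feature itself.
DEEP_DIVE_PREFIX = (
    "Do a deep dive into the following topic and make sure to explore "
    "any relevant side topics: "
)

def deep_dive_prompt(topic: str) -> str:
    """Build the full prompt from the saved prefix and a topic."""
    return DEEP_DIVE_PREFIX + topic

# The assembled prompt can then be handed to a local model, for example with
# the ask_local_model() helper sketched earlier.
print(deep_dive_prompt("the history of the typewriter"))
```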
Also: I tried Sanctum's local AI app, and it's exactly what I needed to keep my data private
Next, there are the knowledge stacks, which allow me to add documents of my own (which always remain local) so the LLM I've chosen can use that information as a source. Let's say I've written several articles on a single subject and want to use their combined information to answer some questions. I could go back through and read every piece in that series, or I could add them to a knowledge stack and then ask my question(s). Ollama will search through every document added to the stack and use that information in its response.
It's really helpful.
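For anyone curious what that looks like under the hood, here's the simplest possible version of the idea in Python: read a few local files and hand their text to the local model as context. Msty's knowledge stacks are more sophisticated than this (they index your documents), and the file names below are placeholders, but the sketch shows why the feature is so handy.

```python
# Simplified sketch of the knowledge-stack idea: pull in local documents and
# pass their text to the local Ollama model as context for a question.
# The file paths are placeholders; this is not Msty's actual implementation.
from pathlib import Path
import requests

def ask_with_documents(question: str, doc_paths: list[str], model: str = "llama3.2") -> str:
    """Answer a question using the text of local documents as the source."""
    context = "\n\n".join(Path(p).read_text() for p in doc_paths)
    prompt = (
        "Using only the following documents as your source, answer the question.\n\n"
        f"Documents:\n{context}\n\nQuestion: {question}"
    )
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]

answer = ask_with_documents(
    "What themes run through this article series?",
    ["article-part-1.txt", "article-part-2.txt", "article-part-3.txt"],
)
print(answer)
```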
2. Perplexity
There's also a desktop app available for Perplexity. The desktop app is pretty much the same as using Perplexity.ai in your browser, but I do find it a bit more efficient to use.
There are two main features that help me with my daily tasks: Search and Research.
Also: How I made Perplexity AI the default search engine in my browser (and why you should too)
If I want to do a standard search with Perplexity, I click the Search button, type my query, and hit Enter on my keyboard. If, on the other hand, I want a deeper dive into a subject, I hit Research and type my query.
One thing you have to know about the Research option is that it really does do a deep dive and can take up to 30 minutes to deliver your results. But when you need to really dig into a subject, that feature is a must. The cool thing about Research is that you can click Tasks and, while it does its thing, see the sources it's using for the deep dive.
Watching Perplexity do its thing can be fascinating.
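The desktop app is how I use Perplexity, but the company also offers a paid API if you'd rather script your searches. Here's a minimal sketch in Python; the "sonar" model name comes from Perplexity's API documentation, so check the current docs before relying on it, and you'll need your own API key.

```python
# Minimal sketch of a scripted Perplexity search via its chat-completions API.
# Assumes a PERPLEXITY_API_KEY environment variable; the "sonar" model name
# may change, so confirm it against Perplexity's current documentation.
import os
import requests

def perplexity_search(query: str) -> str:
    """Run a single search-style query against the Perplexity API."""
    resp = requests.post(
        "https://api.perplexity.ai/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['PERPLEXITY_API_KEY']}"},
        json={
            "model": "sonar",
            "messages": [{"role": "user", "content": query}],
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

print(perplexity_search("What were the major Linux desktop releases this year?"))
```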
One thing to keep in mind with Research is that the free version limits the number of queries you can run per day. You can upgrade to the Pro plan for unlimited free searches and 300+ Pro searches per day. The Pro plan costs $20 per month.
Another very helpful feature in Perplexity is Spaces. With this feature, I can create custom spaces for different topics. I can then switch spaces, run a query, and know that the query will be isolated to that space, meaning when I want to recall it later, I only have to switch to the space and find it. That makes it much easier to keep track of previous queries without having to scour a long list.
Also: I tried Perplexity's assistant, and only one thing stops it from being my default phone AI
Between these two AI tools on my Linux desktop, I'm able to get far more done every day. I highly recommend you give one (or both) of them a try.