Tech With Tim

Python and AI tutorials YouTube channel by Tim Ruscica. Strong implementation focus — every video walks through working code, from notebook to deployment, with the assumption the viewer will run it themselves. Style is calm, concrete, no hype. Coverage in this wiki is concentrated on local AI infrastructure — fine-tuning small open-source models, getting them into Ollama via Modelfiles, and the practical Python recipes for using them.
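The Modelfile step mentioned above can be sketched as follows. This is a minimal example, not from any specific video: the GGUF filename, model name, and system prompt are placeholders, while `FROM`, `SYSTEM`, and `PARAMETER` are standard Ollama Modelfile directives.

```
# Minimal Ollama Modelfile sketch (placeholder names throughout).
# FROM points at the fine-tuned GGUF downloaded from the Colab run.
FROM ./my-finetune.gguf

# Bake in a system prompt matching the fine-tuned capability.
SYSTEM "You answer questions about the custom domain the model was tuned on."

# Optional sampling parameters.
PARAMETER temperature 0.7
```

The model is then registered with `ollama create my-model -f Modelfile` and served like any built-in model via `ollama run my-model`.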

Channels

  • YouTube: Tech With Tim — Python tutorials, AI, Ollama, fine-tuning, local LLM workflows
  • Site: techwithtim.net

Content in This Wiki

Key Ideas

  • “If you have bad data, you’re going to have a poorly fine-tuned model” — the most important step in fine-tuning is the dataset, not the training loop. Tim spends as much video time on the JSON example format as on the trainer config.
  • Fine-tune to gain a specific capability, accept the loss of general capability — an explicit acknowledgement that fine-tuned models get worse at general tasks while getting better at the one they were tuned for. Pick the trade-off deliberately.
  • Google Colab is the no-GPU path — Tim’s recommendation for users without an RTX 4080+: don’t try to fine-tune locally, use Colab’s free T4 GPU runtime. The whole pipeline runs in a notebook; the only local step is downloading the GGUF and loading it into Ollama.
  • OpenAI-compliant APIs are the unifier — across both videos in the wiki, Tim returns to the point that local model runners (Ollama, Docker Model Runner) all expose the same API shape, so the same Python client works against any of them by swapping the base URL.
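The last point can be sketched with stdlib-only code. The request shape below is the standard OpenAI chat-completions payload; `http://localhost:11434/v1` is Ollama's documented OpenAI-compatible endpoint, and the model name is a placeholder. Swapping runners means swapping only the base URL.

```python
import json

def build_chat_request(base_url: str, model: str, user_prompt: str):
    """Build an OpenAI-style chat completion request as (url, JSON body).

    The payload shape is identical for every OpenAI-compliant runner;
    only base_url differs.
    """
    url = base_url.rstrip("/") + "/chat/completions"
    body = {
        "model": model,
        "messages": [{"role": "user", "content": user_prompt}],
    }
    return url, json.dumps(body)

# Same payload, different runner — only the base URL changes.
# Ollama's OpenAI-compatible endpoint (default port 11434):
ollama_url, payload = build_chat_request(
    "http://localhost:11434/v1", "llama3", "Hello"
)
```

In practice the same swap works with the official `openai` Python client by passing `base_url=` when constructing the client, which is why one script can target Ollama, Docker Model Runner, or the hosted API unchanged.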

See Also