#349 Run LLMs Locally on Fedora with Ollama and OpenWebUI
Closed: published 3 months ago by rlengland. Opened 3 months ago by glb.

AI and ML are the current buzzwords, and many people want to get started running LLMs locally. This article is intended as the starter for a series of material for people who want to use Fedora as a platform for day-to-day use of AI and related technologies.

Intended publish date 07-01-2025

https://discussion.fedoraproject.org/t/pitch-run-llms-locally-on-fedora-with-ollama-and-openwebui/141475


Metadata Update from @glb:
- Issue assigned to sumantrom
- Issue tagged with: article, needs-image, needs-series

3 months ago

Metadata Update from @rlengland:
- Custom field publish adjusted to 2025-01-07

3 months ago

Here's the draft. I have taken the liberty of crafting a cover image; please review! https://fedoramagazine.org/wp-admin/post.php?post=41465&action=edit

Metadata Update from @rlengland:
- Custom field preview-link adjusted to https://fedoramagazine.org/?p=41465&preview=true

3 months ago

I've adjusted the image to the magazine-standard size of 1890×800 px and added a text overlay.

Metadata Update from @rlengland:
- Custom field image-editor adjusted to sumantrom

3 months ago

Metadata Update from @rlengland:
- Issue untagged with: needs-image

3 months ago

@glb I note that this article is flagged with "needs-series". Can you indicate which other article(s) should be part of the series?

The other articles in the pipeline would be:

- Image Generation with Stable Diffusion on Fedora
- Fine-tuning and training Llama with Unsloth
- Intel OpenVINO x Fedora

I will add more as we move ahead.

Metadata Update from @rlengland:
- Custom field editor adjusted to rlengland

3 months ago

Issue status updated to: Closed (was: Open)
Issue close_status updated to: published

3 months ago
