@Smokeydope@lemmy.world

Mistral Small 22B just dropped today and I am blown away by how good it is. I was already impressed with Mistral NeMo 12B's abilities, so I didn't know how much better a 22B could be. It passes really tough, obscure trivia that NeMo couldn't, and its reasoning abilities are even more refined.
With Mistral Small I have finally reached the plateau of what my hardware can handle for my personal use case. I need my AI to generate at least around my base reading speed. The lowest I can tolerate is ~1.5 T/s; anything lower is unacceptable. I really doubted that a 22B could even run on my measly Nvidia GTX 1070 8GB VRAM card and 16GB of DDR4 RAM. NeMo ran at about 5.5 T/s on this system, so how would Small do?
Mistral Small Q4_K_M runs at 2.5 T/s with 28 layers offloaded onto VRAM. As context increases, that number drops to 1.7 T/s. It is absolutely usable for real-time conversation needs. Sure, I would like the token speed to be faster, and I have considered going with the lowest recommended Q4 quant to help balance the speed a little. However, I am very happy just to have it running and actually usable in real time. It's crazy to me that such a seemingly advanced model fits on my modest hardware.
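For anyone curious why 28 layers ends up being about the limit on an 8GB card, here is a rough back-of-envelope sketch; the file size, layer count, and overhead figure are my own assumptions for illustration, not official specs:

```python
# Back-of-envelope estimate of VRAM used by a partial GPU offload.
# Assumptions (not official numbers): a 22B model at Q4_K_M is
# roughly a 13 GB file spread across roughly 56 transformer layers.
MODEL_GB = 13.0   # assumed Q4_K_M file size
N_LAYERS = 56     # assumed transformer layer count
VRAM_GB = 8.0     # GTX 1070

per_layer_gb = MODEL_GB / N_LAYERS  # ~0.23 GB per layer

def vram_needed(offloaded_layers, overhead_gb=1.0):
    """VRAM for the offloaded layers plus a guessed overhead
    for KV cache and compute buffers."""
    return offloaded_layers * per_layer_gb + overhead_gb

# 28 of ~56 layers on the GPU:
print(round(vram_needed(28), 2))  # 7.5, just under the 8 GB card
```

The remaining ~28 layers stay in system RAM and run on the CPU, which is why the speed sits between a full-GPU and full-CPU run.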
I'm a little sad now though, since this is as far as I think I can go on the AI self-hosting frontier without investing in a beefier card. Do I need a bigger, smarter model than Mistral Small 22B? No. Hell, NeMo was serving me just fine. But now I want to know just how smart the biggest models get. I've caught AI Acquisition Syndrome!
I am a hobbyist computer and IT guy. I'm not professionally trained, but I grew up with the technology and have been tinkering with it for years. I am still learning new things and enjoy deepening my understanding. Troubleshooting is often a great journey to discovering new insights.
Shelved in the basement was a desktop PC released in 2018: a Ryzen 5 2600 6-core CPU, 24GB of DDR4 RAM, and an AMD RX 580. These days such specs are modest compared to the latest and greatest, but still pretty good IMO. If I remember right, it was having some graphical issues, probably caused by an HDMI cable or something. It was a long time ago; I have no idea why such a good PC ended up collecting dust. Oh well, as a silver lining, this story is about giving the PC new life.
This week I began tinkering around with local AI. Llama 3.1 8B just got released, and I have been having lots of fun learning with it on the laptop. Sadly, my poor old ThinkPad is just not meant for that kind of work. It was sloow to generate text and process information.
So, remembering the 6-core desktop in the basement, the time felt right to dust off the PC and get it doing some useful computing. Unfortunately, while the specs are powerful, the thing's wifi never worked right for some reason. I never thought much about it, since the PC used to sit next to a router connected via Ethernet. Now it needs to live significantly further away and rely solely on wifi for big file transfers.
On an internet connection where my laptops right next to it were getting hundreds of Mbps download, the PC was getting 10 Mbps. I've had metal-cased desktops before and none of them were this bad connection-wise. Something was seriously wrong, bottlenecking an otherwise great setup. At first I figured it must be a Linux driver issue or some kind of software bug. I spent hours installing the right drivers for my specific wifi card and troubleshooting via terminal. Didn't help any.
Then I figured maybe the card was bust and researched new wifi cards. I always thought wifi cards were little chips and antennas built into the motherboard. Not the case with this computer.
My first important discovery was that this computer had a huge wifi card mounted just underneath the graphics card, taking up its own slot in the back. This makes sense: if you want to upgrade to the newest wifi standard in 10-20 years, just pop an updated card into the slot.
My second important discovery was realizing the beastly wifi card had two little brass bits poking out the back of the PC. Threaded bits. Hey, I know these, they're male coaxial connectors... For an... antenna... facepalm
The realization hit me like a club. Oh... OH. YOOO IT NEEDS ANTENNAS, DUDE. I had been using a radio technology with either no antenna or an inbuilt one so awful it might as well be malfunctioning.
I felt like an idiot. I've seen the back of that PC many times, but for some reason I just never noticed or thought about the coaxial bits and what they could be for. Oh well, let's just order some cheap sticks and hope it helps.
So, with the cheap set of antennas in hand, I screwed them on. I honestly expected it not to do anything, because it's never that simple. I fired up a speed test before and after installation. Before the antennas: 10 Mbps up and down. After installing the antennas: >200 Mbps down and >100 Mbps up. Yeeeeah, looks like that took care of the issue right away.
In the future I'll look at the back of my big desktops and see if they could be easily upgraded with a set of antennas. The more you know!
Hello, I am trying to get some advice from experienced electricians and engineers on what jobs could be a good fit for my experience and skill sets, as well as advice on how to do a better job picking work that won't screw me over.
I am a nationally certified (NOCTI) Electromechanical Engineer. I got mentally/emotionally chewed up and spit out after working as a maintenance technician for a couple of years as a young 'n dumb kid right out of school. I have kept my electrical skills sharp enough to wire up my own off-grid solar DC systems. I remember enough theory to do calculations and read schematics. My maintenance days left me somewhat familiar with electrical wiring, air duct systems, mechanical drives, pneumatic/hydraulic systems, PLC automation, and repairing broken parts with all manner of tools. I enjoy the feelings of satisfaction and capability that come from successfully putting together and maintaining an efficient, functioning system.
But I'm kind of scared to get back into the career field knowing how dangerous it can be (I've mainly worked on 480V systems) and how little money I was paid before. On one hand, I feel like I should use my highly technical skills and further a real career. On the other hand, every company I've ever worked for has screwed me over with promised training that never happened, severely understaffed and stressed-out maintenance teams who didn't have the time or energy to teach a newbie, and OSHA violations so egregious the inspectors were surely bribed.
I guess I'm trying to ask where I went wrong. What job paths are a better use of my skills that aren't so mentally and physically taxing? What are some red flags to look out for? What is contracting work like? Should I try to get into a union? I really don't know if I want to get back into this career field, and I don't know if I want to commit to a 2-year apprenticeship contract.
I'm kind of an environment guy who cares about clean energy and would love to help out the planet a little through my work. Sometimes I fantasize about working on solar arrays and renewable energy stuff.
I'm pretty good with computers and IT: I use Linux daily, can SSH into a remote server, port forward, and have set up some local services on my own network. I am a main developer of an open source project and am decently familiar with the basics of programming in Lua and committing with git. A lot of the older guys have appreciated my help navigating companies' old, poorly organized intranets for schematic scans and work orders.
I am in my mid 20s, single and from the US but willing to travel.
I'm not really a political person, but the one thing I do care about is pot. Which candidate is most supportive of federally legalizing pot, or at least bumping down its Schedule I drug status?
This post was inspired by the surge of people mentioning the new Kagi search engine in various Lemmy comments. I happen to be somewhat knowledgeable on the topic and wanted to tell everyone about some other alternative search engines available to them, as well as the difference between meta search engines and true search engines. This guide was written with the average person in mind; I have done my best to avoid technical jargon and speak plainly, in a way most should be able to understand without a background in IT.
There are many alternative search engines floating around, but most of them are meta search engines, meaning they are a kind of search result reseller: middlemen to the true search engines. They query the big engines for you and aggregate their results.
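To make that concrete, here is a toy sketch of what a meta search engine does under the hood; the fetch functions and URLs are hypothetical stand-ins for real upstream API calls:

```python
# Toy illustration of a meta search engine: fan the query out to
# upstream engines, then merge and de-duplicate the results.
# fetch_bing/fetch_google are hypothetical stand-ins, not real APIs.

def fetch_bing(query):
    return ["https://example.com/a", "https://example.com/b"]

def fetch_google(query):
    return ["https://example.com/b", "https://example.com/c"]

def meta_search(query, upstreams):
    seen, merged = set(), []
    for fetch in upstreams:
        for url in fetch(query):
            if url not in seen:   # de-duplicate across engines
                seen.add(url)
                merged.append(url)
    return merged

print(meta_search("lemmy", [fetch_bing, fetch_google]))
# ['https://example.com/a', 'https://example.com/b', 'https://example.com/c']
```

The real services add ranking, result blending, and privacy stripping on top, but the reseller structure is the same.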
Format: Meta Search Engine / Sourced True Engines (and a hyperlink to where I found that info)
DuckDuckGo / Bing; has some web crawling of its own but mostly relies on Bing
Ecosia / Bing + Google; a portion of profit goes to tree planting
Kagi / Google, Mojeek, Yandex, Marginalia; requires email signup, $10/month for unlimited searches
SearXNG / Too many to list, basically all of them; configurable, Free & Open Source Software (AGPL-3.0)
4get / Google, Bing, Yandex, Mojeek, Marginalia, Wiby; open source software made by one person as an alternative to SearX
Qwant / Bing; relied on Bing for most of its life but in 2019 started building up its own web crawlers and infrastructure, putting it in a unique transitional phase
As you can see, the vast majority of alternative search engines rely on some combination of Google and Bing. The reason is that the technologies which power search engines, web crawling and indexing, are extremely computationally heavy, non-trivial things.
Powering a search engine requires costly enterprise computers. The more popular the service (as in, the more people connecting to and using it per second), the more internet bandwidth and processing power is needed. It takes a lot of money to pay for power, maintenance, and development/security. At the scale of Google and Bing, who serve many millions of visitors each second, huge warehouses full of specialized computers, known as data centers, are needed.

This is a big financial ask for most companies interested in making a profit out of the gate, so they decide it's worth just paying Google and Bing for access to their enormous pre-existing infrastructure, without the headaches of dealing with maintenance and security risks.
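To give a feel for the indexing half of that workload, here is a miniature inverted index, the core data structure behind a search engine: a map from each word to the pages that contain it. The pages here are made up; real engines build this over billions of pages, which is where the cost comes from.

```python
# Miniature inverted index over three fake pages.
from collections import defaultdict

pages = {
    "page1": "local ai models on modest hardware",
    "page2": "wifi antennas fix slow connections",
    "page3": "local wifi hardware tips",
}

index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)   # word -> set of pages containing it

def search(*words):
    """Pages containing every query word (set intersection)."""
    results = [index[w] for w in words]
    return set.intersection(*results) if results else set()

print(sorted(search("local", "hardware")))  # ['page1', 'page3']
```

A query never scans the pages themselves; it just intersects a few precomputed sets, which is what makes lookups fast once the (expensive) index exists.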
True search engines are powered by their own internally owned and operated web crawlers, indexers, and everything else that goes into making a search engine under the hood. They tend to be owned by big tech companies with the financial resources to afford huge arrays of computers that process and store all that information for millions of active users each second. The last two entries are unique exceptions we will discuss later.
Bing / Owned by Microsoft
Google / Owned by Google/Alphabet
Mojeek / Owned by Mojeek Ltd
Yandex / Owned by Yandex Inc
YaCy / Free & Open Source Software (GPL-2.0); powered by peer-to-peer technology, created by Michael Christen
Marginalia Search / Free & Open Source Software (AGPL-3.0); developed by Marginalia / Martin Rue
You may be wondering how any service can remain free when hosting costs money. Well, that is where altruistic computer hobbyists come in. The internet allows knowledgeable, tech-savvy individuals to host their own public services on their own hardware, capable of serving many thousands of visitors per second.

The financially well-off hobbyist eats the fairly small hosting cost out of pocket. A thousand hobbyists running the same service all over the world allows the load to be distributed evenly, and lets people choose the geographically closest instance for the fastest connection speed. Users of these free public services are encouraged to donate directly to the individual operators if they can.
An important takeaway is that services don't need to make a profit if they aren't a business's product. Sometimes people are happy to sacrifice a bit of their own resources for the betterment of thousands of others.

Companies that live and die by profit margins have to choose between owning their own massive computer infrastructure or renting lots of access to someone else's. You and I just have to pay a few extra cents on the electric bill that month for a spare computer sitting in the basement running a public service, plus some time investment to get it all set up.
As Lemmy users, you should at least vaguely understand the power of a decentralized service spread out among many individually operated and maintained instances that can cooperate with each other. Spreading users across multiple instances helps prevent any one of them from exceeding the free/cheap allotment of API calls, in the case of meta search engines like SearXNG, or from being rate limited, like third-party YouTube scrapers such as Invidious and Piped.

In the case of YaCy, decentralization is also federated: all individual YaCy instances communicate with each other through peer-to-peer technology to act as one big collective web crawler and indexer.
I love SearXNG. I use it every day, so it's the engine I want to impress on you the most. SearX/SearXNG is a free and open source, highly customizable, and self-hostable meta search engine. SearX instances act as middlemen: they query other search engines for you, stripping out all their spyware/ad crap, and your connection never touches those engines' servers.

Here is a list of all public SearX instances; I personally prefer paulgo.io. All SearX instances are configured differently to index different engines. If one doesn't seem to give good results, try a few others.
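If you'd rather run your own instance than trust a public one, a minimal Docker-based deployment sketch looks something like this; it assumes the official searxng/searxng image, and the port, paths, and URL are illustrative placeholders, so check the SearXNG docs for the full setup:

```shell
# Minimal sketch: run a local SearXNG instance in Docker
# (illustrative values, not a hardened production setup).
docker run -d --name searxng \
  -p 8080:8080 \
  -v "$PWD/searxng:/etc/searxng" \
  -e "BASE_URL=http://localhost:8080/" \
  searxng/searxng
# Then browse to http://localhost:8080 and edit searxng/settings.yml
# to choose which upstream engines your instance queries.
```

This is a deployment/config fragment rather than a program, so treat it as a starting point, not a finished recipe.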
Did I mention it has bangs like DuckDuckGo? If you really need Google, like for maps and business info, just use !!g in the query.
Here is Marginalia Search, a completely novel search engine written and hosted by one dude. It aims to prioritize indexing lighter websites with little to no JavaScript, as these tend to be personal websites and homepages with poor Search Engine Optimization (SEO) scores, which means the big search engines won't index them well. If you remember the internet of the early 2000s and want a nostalgia trip, this one's for you. It's also open source and self-hostable.
Finally, YaCy is another completely novel search engine that uses peer-to-peer technology to power a big web crawler, which prioritizes indexing based on user queries and feedback. Everyone can download YaCy and devote a bit of their computing power to running their own local instance while helping out a collective search engine. Companies can also download YaCy and use it to index their private intranets.
They have a public instance available through a web portal. To be upfront, YaCy is not a great search engine for what most people usually want, which is quick, relevant information within the first few clicks. But it is an interesting use of technology, and it shows what a true, honest-to-god, community-operated search engine looks like, untainted by SEO scores or corporate money-making shenanigans.
I personally trust some FOSS-loving sysadmin who hosts social services for free out of altruism, accepts hosting donations, and whose server is located on the other side of the planet with my query info over Google/Alphabet any day. I have had several communications with Marginalia over several years now, through the Gemini protocol and the small web; they are more than happy to talk over email. Have a human conversation with your search engine provider who's just a knowledgeable everyday Joe, someone who genuinely believes in the project and freely dedicates their resources to it. Consider sending some cash their way to help with upkeep if you like the services they provide.
Of course, you have to trust the service provider with your information, and trust that their systems are secure and maintained. Trust is a big concern with every engine you use, because while they can promise not to log anything or sell your info for profit, they often provide no way of proving those claims beyond 'just trust me bro'. The one thing I really liked about Kagi was that they went through a public security audit by an outside company that specializes in hacking your systems to find vulnerabilities. They got a great result and shared it publicly.
The other concern is that there is no way to be sure companies won't just change their policies slowly over time, creeping in advertisements and other things they once set out to reject, once they lure in a big enough user base and the greed for ever-increasing profit margins to appease shareholders kicks in. Companies have been shown again and again to employ this slow-boiling-frog practice; beware.
Still, if you are absolutely concerned with privacy and knowledgeable with computers, then self-hosting FOSS software on your own instance is the best option for maintaining control of your data.
I hope this has been informative to those who believe there are only a few options to pick from, and that you find something which works for you. During this difficult time, when companies and advertisers are trying their hardest to squeeze us dry and reduce our basic human rights, we need to find ways to push back. To say no to subscriptions and ads and convenient services that don't treat us right. The internet started as something made by everyday people, to connect with each other and exchange ideas. For fun and whimsy and enjoyment. Let's do our best to keep it that way.
From what I can gather, this conflict has been going on a long time, and the Hamas group has existed for a while too. Why are all the news cycles suddenly focusing on this the past few weeks?
I am doing research on best practices for my lithium batteries and LiFePO4 power station. There are some conflicting opinions and variation in recommended cycle numbers.

Will leaving my things plugged in at 100% hurt them more than constantly unplugging at 80% and replugging at 20%?