Cheap CUDA cards

Does anyone know a good place to get cheap (NVIDIA) graphics cards besides eBay (shipping to the U.S.)? I’m currently using two (pre-owned) GTX 1060s to train Rhasspy speech models and voices, and I’d like to either get more or something better.

Thanks to the 30xx shortages, it seems like all the good CUDA cards (1080 Ti, 2080 Ti) are holding their value, even used. I know there are cloud services like Colab, but I’d like to keep things local :slight_smile:

Have you looked at the Coral USB adapter?
Not sure if that’s in the same ballpark as CUDA.

Update… It might be after looking at this writeup.

This sounds like a great way to donate to your efforts! Want to DM me a link to something you’d like? I’m happy to help!

Make an Amazon wishlist! Or a donation option :slight_smile:

It looks good for inference, but I need CUDA for training new models. I do wonder if we could get the TTS voices to run on one of the Coral adapters though…

It would be, thank you for the offer :slight_smile: Looking at even the used prices, though, it’s a bit much for a donation. The GTX 1060 6GB (what I currently have) is around $150 on eBay pre-owned!

Here are the cards I’ve been looking at:

| Name | CUDA cores | Memory | Cost (used) |
|---|---|---|---|
| GTX 1060 | 1,280 | 6 GB | $150 |
| GTX 1080 Ti | 3,584 | 11 GB | $350 |
| RTX 2070 Super | 2,560 | 8 GB | $450 |
| RTX 2080 Ti | 4,352 | 11 GB | $950 |

These seem to be the best performance-per-dollar according to Tim Dettmers. I can’t believe the cards from two generations back are still so expensive!
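For fun, here’s a quick cores-per-dollar check on those numbers. It’s crude since it ignores architecture, clocks, tensor cores, and memory, so treat it as a rough tiebreaker rather than a real benchmark:

```python
# Crude CUDA-cores-per-dollar comparison using the used prices above.
# Ignores architecture, clocks, tensor cores, and memory.
cards = {
    "GTX 1060":       (1280, 150),
    "GTX 1080 Ti":    (3584, 350),
    "RTX 2070 Super": (2560, 450),
    "RTX 2080 Ti":    (4352, 950),
}

# Sort best-value first and print cores per dollar.
for name, (cores, price) in sorted(cards.items(),
                                   key=lambda kv: -kv[1][0] / kv[1][1]):
    print(f"{name:15s} {cores / price:5.2f} cores/$")
```

By raw cores per dollar the 1080 Ti comes out on top, which lines up pretty well with Dettmers’ rankings.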

Love the name and profile pic @kahless :smiley:

You guys are great, but I’m really hesitant to accept any money just in case there’s some kind of conflict with my job. But if someone has a cheap card they’re willing to sell, that seems fine :wink:

:slight_smile:

These used cards are still holding their value because they’re still profitable for crypto mining.

Ah, this makes sense. I was excited when I read that the P106-100 mining cards could be used for machine learning, only to find out that they’re now more expensive than the (equivalent) 1060 6GB’s which actually have display output!

I know these commodity cards wouldn’t exist if not for the market, but it’s frustrating having to compete with gamers and miners (and the scalpers among them) for hardware. NVIDIA’s “enterprise” cards are just laughably expensive, so until AMD gets ROCm up to a usable level, I’ll have to outbid kids wanting more FPS in Fortnite :confused:

Did you ever get a card? I may have a 1070 founders edition I’m willing to part with soon.

I managed to get a used 2080 Ti for under $700 off eBay. But I’m always looking for more cards to add to my little GPU “fleet” (currently the 2080 Ti and 3× 1060 6GB).

I have training running 24/7 on these cards, and plenty of other things in the queue. So I’d be interested if you’re willing to sell it :slight_smile:

It will probably be the first of the year before I can ship it. It’s in my son’s PC at the moment. He’s getting my old 2070, since I just snagged a 3070… assuming no major shipping delays and I get it before Christmas. Trickle-down PC upgrades :smiley:

How old of a card is usable (800 or 900 series)?

Thanks! Let me know whenever and we can exchange info :slight_smile: I’m sure your son will enjoy gaming on the 2070.

Congrats on snagging an Ampere card. I’m drooling over the 3090, but it’s both pricey and out of stock everywhere. But I’ve entered a YouTube giveaway, so we’ll see where that goes :laughing:

The 1060 6GB seems to be as low as I can go with the current MozillaTTS models.

With the recommended batch sizes, I use about 5.5 GB of VRAM. The 8 GB on the 1070 Ti would be great for training the second half of the model (a vocoder), since I have to cut those batch sizes down for the 1060 or wait for my 2080 Ti to become available :+1:
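For anyone curious, this is roughly the back-of-the-envelope scaling I do in my head. The batch size of 32 below is just an illustrative number (not the actual MozillaTTS setting), and it assumes memory grows linearly with batch size, which is only approximately true in real training runs:

```python
# Rough, linear estimate of how batch size scales with VRAM.
# Assumes memory use grows linearly with batch size, which is
# only an approximation for real training.

def scaled_batch(batch, used_gb, target_gb, reserve_gb=0.5):
    """Estimate the largest batch that fits in target_gb,
    keeping reserve_gb free for the CUDA context and fragmentation."""
    per_sample = used_gb / batch
    return int((target_gb - reserve_gb) / per_sample)

# Example: a hypothetical batch of 32 using ~5.5 GB on a 6 GB GTX 1060.
print(scaled_batch(32, 5.5, 8.0))   # 8 GB card (1070/1070 Ti) -> 43
print(scaled_batch(32, 5.5, 11.0))  # 11 GB card (2080 Ti)     -> 61
```

In practice I’d still round down and watch `nvidia-smi`, since activation memory doesn’t scale perfectly linearly.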

Ok, the card is out and boxed up, and I should be able to get it shipped sometime this week. I’m not sure how to DM in here.
Let me know what you feel is a good price for the 1070, and your shipping address… if you can DM me. Otherwise, my Gmail address is the same as my username in here :wink:
I don’t mind shipping it first, and then you can just send a check when you get it.
Note: it’s not a “Ti” version; it’s the NVIDIA GTX 1070 Founders Edition. I never overclocked it, but I believe the Founders Edition was supposed to be.

The 780 (CUDA 7.5) is the best bang for the buck for me if you just want CUDA cores, but if you also want to test cutting-edge inference, then the RTX 2060 is the minimum card that has tensor cores.

The whole GDDR6 availability sham currently means the GPU market is a sellers’ market, which will probably crash some time in the new year.
AMD and NVIDIA will trade blows once stock is available. I’m in the same boat, scouring UK eBay for a 1070 Ti because of the 8 GB, but for cutting-edge real-time inference I’m wondering if I should just go RTX 3060 Ti as the cheapest way to dip my toes in.
(I have a feeling the RTX 3060 might be minus tensor cores, but I’m waiting to see.)

What NVIDIA just demonstrated with RTX Voice (centralized, multi-user inference sharing) has to be a consideration in modern home AI.
You get tensor cores, and much of what would otherwise be a ‘fat chance’ could actually run locally.

Just reading this discussion as I was searching for references to the Coral.

I currently use one for object detection, but it has cycles to spare, and I recently saw the demo model for Coral speech recognition. I already use Rhasspy and wondered if / how much it had been looked into. Since there is only a demo model, from what I read it almost sounds like each user would need to build a model for their own sentences…

Anyway, I thought it was a cool idea… is this still a possibility in the future? I’d try to help if I can…

DeadEnd

On a Pi it’s USB only, since the VideoCore 4/6 bootloader is still 32-bit, while the PCIe interfaces of the Coral cards are 64-bit only.

I have a Coral Mini PCIe in a NUC but never got around to playing with more than the parrot detector demo.
If the model is TensorFlow, the tools to partition it and run it on the Coral are there; I’m not sure about other engines.

The crazy GPU prices look like they will continue for what might be years rather than months, according to some recent reports, though there are some headless mining cards that do go for less.