AMD CEO Dr. Lisa Su believes that artificial intelligence (AI) will play a critical role in the industry's future

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 27, 2023.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    46,372
    Likes Received:
    14,219
    GPU:
    AMD | NVIDIA
    fantaskarsef likes this.
  2. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    14,625
    Likes Received:
    8,141
    GPU:
    2080Ti @h2o
    Buzzword bingo.
     
    schmidtbag, toyo, mackintosh and 4 others like this.
  3. Jenda

    Jenda Member

    Messages:
    30
    Likes Received:
    12
    GPU:
    6700XT 12GB
    Poor Lisa is the last person on Earth to realize that. Nvidia has been working on AI for years already.
    What a statement /clap
     
    barbacot likes this.
  4. Maddness

    Maddness Ancient Guru

    Messages:
    2,337
    Likes Received:
    1,594
    GPU:
    3080 Aorus Xtreme
    I think AI is going to play a huge part in everything moving forward, eventually touching every industry. Nvidia saw this and is cashing in big time.
     
    barbacot and Embra like this.

  5. KissSh0t

    KissSh0t Ancient Guru

    Messages:
    12,625
    Likes Received:
    6,308
    GPU:
    ASUS RX 470 Strix
    Gotta grab onto something after the mining crash.
     
  6. barbacot

    barbacot Master Guru

    Messages:
    898
    Likes Received:
    863
    GPU:
    MSI 4090 SuprimX
    Good luck catching Nvidia and Intel now....:p

    Intel's Agilex FPGAs are already in their seventh generation, Nvidia has... well, everything, and now the AMD CEO has finally had an epiphany - give her a raise!
     
    Last edited: Mar 27, 2023
  7. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    6,903
    Likes Received:
    8,535
    GPU:
    RX 6800 XT
    You do realize AMD bought Xilinx? A company that makes FPGAs and ACAPs.
    AMD also has CDNA, which has Tensor cores for Machine learning. CDNA3 releases this year.

    AMD might be behind the curve in the consumer market with AI, but it's much more competitive in the professional space.
     
    reix2x and moo100times like this.
  8. Fediuld

    Fediuld Master Guru

    Messages:
    769
    Likes Received:
    450
    GPU:
    AMD 5700XT AE
    Butlerian Jihad NOW.

    Wake up people.
     
  9. pharma

    pharma Ancient Guru

    Messages:
    2,265
    Likes Received:
    954
    GPU:
    Asus Strix GTX 1080
    Even in the professional space, AMD (Xilinx included) is behind, with different acceleration hardware and, specifically, the software used to enhance AI algorithms and processes. The "AI Accelerator cores" are similar to tensor cores, but hardware acceleration involves different components and is not a direct substitute in AI systems built on rival technologies.

    For the past few years AMD has stated they will not be implementing any AI functionality in the consumer market, only professional. I won't be surprised if this flip-flops in the future.
     
  10. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    6,903
    Likes Received:
    8,535
    GPU:
    RX 6800 XT
    It has already flipped. AMD has already stated they will add tensor units to RDNA4. But that is for late 2024, a very long time away in the computer space.
     

  11. Picolete

    Picolete Master Guru

    Messages:
    473
    Likes Received:
    247
    GPU:
    R9 290 Sapphire Tri-x
    What I want to see is AI used for climate pattern recognition, so we can finally get more accurate weather forecasts
     
  12. barbacot

    barbacot Master Guru

    Messages:
    898
    Likes Received:
    863
    GPU:
    MSI 4090 SuprimX
    Buying is not the same as using it (at least for a good purpose...)

    I don't know about the professional space - I thought Nvidia had that covered :p - but in the scientific field, if we want to build a deep neural network the solutions are always the same: an Nvidia GPU or an Intel FPGA. Also, it's not just the hardware but software support, and support in general, that AMD is seriously lacking - they remind me of Microsoft in the 2000s, when they lost the internet race by not paying much attention to Facebook, Google, Amazon, etc. Even today they have not fully recovered...
    And your answer said it all: "AMD also has CDNA, which has Tensor cores for Machine learning. CDNA3 releases this year." - late to the party as always (Nvidia has used tensor cores since 2017)....but better late than never, no?
    Also, who cares about AMD tensor cores without proper software??? Nvidia has CUDA everywhere - and it's really simple to use and can be called from C# or Python, for example...

    Again - give her a raise!:p
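    As a rough illustration of what the tensor cores being argued about actually do, here is a pure-Python sketch (no GPU and no CUDA involved) of the tile-level fused multiply-accumulate D = A·B + C that tensor-core hardware executes on small matrix tiles. The function name and toy values are made up for this post:

    ```python
    # Sketch of the fused multiply-accumulate (D = A @ B + C) that tensor
    # cores perform on small matrix tiles, in plain Python for illustration.
    # This is not CUDA; it only shows the operation being accelerated.

    def matmul_accumulate(A, B, C):
        """Compute D = A @ B + C for square tiles given as lists of lists."""
        n = len(A)
        return [
            [sum(A[i][k] * B[k][j] for k in range(n)) + C[i][j]
             for j in range(n)]
            for i in range(n)
        ]

    # A 2x2 toy tile: identity @ B + C equals B + C element-wise.
    I = [[1, 0], [0, 1]]
    B = [[2, 3], [4, 5]]
    C = [[1, 1], [1, 1]]
    print(matmul_accumulate(I, B, C))  # [[3, 4], [5, 6]]
    ```

    Real tensor cores do this for whole 4x4 (or larger) tiles in one hardware instruction, usually in mixed precision, which is where the speedup comes from.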
     
  13. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    6,903
    Likes Received:
    8,535
    GPU:
    RX 6800 XT
    CDNA has had tensor units since its first iteration.
    And Xilinx was producing and selling FPGAs and ACAPs before the merger, and is still doing so after it.
    AMD might be a bit behind the game compared to Nvidia, but not as much as you think.
     
  14. H83

    H83 Ancient Guru

    Messages:
    4,826
    Likes Received:
    2,331
    GPU:
    XFX Black 6950XT

    Unfortunately, AMD doesn't have the resources to bet on every possible market, so they have to lag behind in some of them and leave others completely unattended.
     
    barbacot likes this.
  15. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,432
    Likes Received:
    3,843
    GPU:
    HIS R9 290
    One of the only things that has truly irritated me about AMD is how little effort they put into GPU compute. They were practically sitting on a goldmine with TeraScale2. GCN had lots of potential, and yet they practically abandoned both. They didn't really have to optimize their drivers that much because they already had tremendous performance; all they had to do was improve documentation and work directly with developers like Nvidia did.
    I don't like how Nvidia monopolized the GPGPU market, but they honestly deserved it - CUDA is objectively a better product, and Nvidia put in a lot of time, money, and resources to make it that way. AMD kinda just sat back idly with the attitude of "we're an alternative if you don't want CUDA for some reason", but now that Intel is offering a legit non-CUDA competitor, AMD is starting to "care". Intel is doing what AMD wouldn't for so many years: actually trying to convince people to switch from CUDA. Ironically, this will somewhat help AMD, but it's Intel who is rightfully going to get the credit.

    I believe we'd have a much more competitive gaming GPU market if AMD had actually tried to compete 10 years ago. They lost billions because they basically just allowed Nvidia to win.
     
    barbacot, Maddness and H83 like this.

  16. geogan

    geogan Maha Guru

    Messages:
    1,190
    Likes Received:
    411
    GPU:
    4080 Gaming OC
    "Two weeks ago, Microsoft said it had bought tens of thousands of Nvidia’s AI-focused processors, the A100 GPU, in order to power the workload of OpenAI. Nvidia has sold 20,000 H100s, the successor to that chip, to Amazon for its cloud computing AWS service, and another 16,000 have been sold to Oracle."

    Those are staggering amounts of expensive GPUs being sold to big tech companies, and this is why Nvidia is now reaping the rewards for all the groundwork it has put into this area over the years.

    I wonder what Oracle is going to produce after buying that many H100s?!

    Source: https://www.theguardian.com/technol...iety-nvidia-chatbots-processing-crypto-mining
     
    AuerX likes this.
  17. Dribble

    Dribble Master Guru

    Messages:
    331
    Likes Received:
    133
    GPU:
    Geforce 1070
    The problem for AMD is that much of the complexity in AI is in the software, and they don't really do software. The hardware doesn't need an x86 license, so anyone can make it - Google has been quite happily developing AI hardware, as have various phone manufacturers. Nvidia's success is because they wrote the software - ChatGPT is Nvidia hardware and software, just customised by the ChatGPT team. Google has equally been developing AI software for a number of years, which is why they have Bard.
    Hence, while I'm sure AMD could produce some great hardware, how are they going to get anyone to buy it without equally great software? It'll be exactly like what happened with GPU compute - in the end, it was the solution with the best software (CUDA) that won.
     
  18. Martin5000

    Martin5000 Master Guru

    Messages:
    301
    Likes Received:
    124
    GPU:
    8 gig
    AI will never be self-aware; it just won't happen. It's pure fiction to think it ever will.
    What people like to call AI are just organisers that move things from one place to another if a certain criterion is met. This isn't AI and never will be. Self-awareness and free will walk hand in hand with the quantum realm, and all man knows to do with that is smash particles together and measure the blasts.
     
  19. Venix

    Venix Ancient Guru

    Messages:
    3,110
    Likes Received:
    1,736
    GPU:
    Palit 1060 6gb
    Neither Google, nor Nvidia, nor any of those players is trying to create self-awareness. AI as we use it now means models that you feed data - tons of data, all the data - and keep improving. For example, you can feed a neural network millions of medical history files from patients with Parkinson's, and it can extrapolate for early diagnosis. The applications of AI as we use it have revolutionized and keep accelerating progress in almost every scientific field, and not only there! And yes, it comes with negatives too. Most likely, "AI" is an unfortunate name for self-improving algorithms.

    One of the examples https://news.mit.edu/2022/artificial-intelligence-can-detect-parkinsons-from-breathing-patterns-0822
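    The "feed it labeled data and let it learn a decision rule" idea above can be sketched in a few lines. This is a toy single-weight perceptron on made-up synthetic data - nothing here resembles a real medical model, and all names and numbers are invented for illustration:

    ```python
    # Toy sketch of supervised learning: a perceptron learns a threshold
    # rule from labeled examples. Data and labels are entirely synthetic.
    import random

    random.seed(0)
    # Synthetic dataset: feature x in [0, 10], label 1 if x > 5 else 0.
    data = [(x, 1 if x > 5 else 0)
            for x in (random.uniform(0, 10) for _ in range(200))]

    w, b, lr = 0.0, 0.0, 0.05
    for _ in range(50):                      # training epochs
        for x, y in data:
            pred = 1 if w * x + b > 0 else 0
            err = y - pred                   # classic perceptron update
            w += lr * err * x
            b += lr * err

    correct = sum((1 if w * x + b > 0 else 0) == y for x, y in data)
    print(correct / len(data))               # accuracy on the training set
    ```

    Real systems like the MIT Parkinson's detector use deep neural networks over vastly richer signals, but the loop is the same shape: predict, measure the error against the label, and nudge the parameters.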
     
    Last edited: Mar 27, 2023
    pharma and Maddness like this.
  20. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,576
    Likes Received:
    536
    GPU:
    RTX3090 GB GamingOC
    Surely it's quantum computers that will deal with AI best. The way they could be used to crack encryption thousands of times faster than normal CPUs matters.

    Imagine ChatGPT 10 years from now. If you were on a phone line with this bot, you might never know it's a bot, which is scary and bad for people's jobs.

    AI could eventually replace many, many jobs, because of course businesses will use bots to replace real people: one, money, and two, bots don't need sleep.

    I mean, we talk about sentient beings, but apart from self-awareness we don't even know what sentience is. I'm not going to go into how a human brain works, but what we do know is that we don't know what consciousness is. So how would we know if a machine can become conscious or not, when consciousness just means having knowledge of something?
     
    Last edited: Mar 28, 2023

Share This Page