NVIDIA Smashes Performance Records on AI Inference

    NVIDIA Extends Lead on MLPerf Benchmark with A100 Delivering up to 237x Faster AI Inference Than CPUs, Enabling Businesses to Move AI from Research to Production

    SANTA CLARA, Calif., Oct. 21, 2020 (GLOBE NEWSWIRE) -- NVIDIA today announced its AI computing platform has again smashed performance records in the latest round of MLPerf, extending its lead on the industry’s only independent benchmark measuring AI performance of hardware, software and services.

    NVIDIA won every test across all six application areas for data center and edge computing systems in the second version of MLPerf Inference. The tests expand beyond the original two for computer vision to include four covering the fastest-growing areas in AI: recommendation systems, natural language understanding, speech recognition and medical imaging.

    Organizations across a wide range of industries are already tapping into the NVIDIA A100 Tensor Core GPU’s exceptional inference performance to take AI from their research groups into daily operations. Financial institutions are using conversational AI to answer customer questions faster; retailers are using AI to keep shelves stocked; and healthcare providers are using AI to analyze millions of medical images to more accurately identify disease and help save lives.

    “We’re at a tipping point as every industry seeks better ways to apply AI to offer new services and grow their business,” said Ian Buck, general manager and vice president of Accelerated Computing at NVIDIA. “The work we’ve done to achieve these results on MLPerf gives companies a new level of AI performance to improve our everyday lives.”

    The latest MLPerf results come as NVIDIA’s footprint for AI inference has grown dramatically. Five years ago, only a handful of leading high-tech companies used GPUs for inference. Now, with NVIDIA’s AI platform available through every major cloud and data center infrastructure provider, companies representing a wide array of industries are using its AI inference platform to improve their business operations and offer additional services.

    Additionally, for the first time, NVIDIA GPUs now offer more AI inference capacity in the public cloud than CPUs. Total cloud AI inference compute capacity on NVIDIA GPUs has been growing roughly 10x every two years.

    NVIDIA Takes AI Inference to New Heights

NVIDIA and its partners submitted their MLPerf 0.7 results using NVIDIA's acceleration platform, which includes NVIDIA data center GPUs, edge AI accelerators and NVIDIA-optimized software.
