
Nvidia enjoys record revenue in Q3

$1.23 billion earned, profits up more than 40 per cent, but no hard data on Shield tablet sales

Nvidia earned a record $1.23 billion in revenue in the third calendar quarter, with profits rising by more than 40 per cent.

That $1.23 billion in revenue was a 16 per cent increase over the same quarter last year. On a GAAP basis, net profit rose by 45 per cent to $173 million. Non-GAAP profit was $220 million, up 43 per cent year-on-year.
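
As a quick sanity check on the percentages quoted above, a minimal sketch (Python) back-computes the implied prior-year figures; the prior-year numbers below are derived from the stated growth rates, not taken from Nvidia's filing.

```python
# Back-compute the implied year-ago figures from the reported growth rates.
# Prior-year values are derived here, not quoted in the article or Nvidia's filing.

q3_revenue = 1.23e9       # reported Q3 revenue, USD
revenue_growth = 0.16     # reported 16% year-on-year increase

gaap_profit = 173e6       # reported GAAP net profit, USD
gaap_growth = 0.45        # reported 45% year-on-year increase

non_gaap_profit = 220e6   # reported non-GAAP profit, USD
non_gaap_growth = 0.43    # reported 43% year-on-year increase

# current = previous * (1 + growth)  =>  previous = current / (1 + growth)
print(f"Implied year-ago revenue:         ${q3_revenue / (1 + revenue_growth) / 1e9:.2f}bn")
print(f"Implied year-ago GAAP profit:     ${gaap_profit / (1 + gaap_growth) / 1e6:.0f}m")
print(f"Implied year-ago non-GAAP profit: ${non_gaap_profit / (1 + non_gaap_growth) / 1e6:.0f}m")
```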

Nvidia's business is still largely based on GPUs for laptops and desktop PCs, which generated $991 million of its revenue - up 13 per cent. By comparison, the company's Tegra division remains a fringe concern, with only $168 million in revenue despite rising 51 per cent year-on-year.

And the Tegra division covers a relatively wide range of products: smartphone and tablet chips, automotive computers, infotainment and navigation systems, and the Shield tablet and portable gaming devices. Indeed, Nvidia offered no specific data on the performance of its Shield tablet, saying only that it "enjoyed solid demand during the quarter."

"Growth drivers have kicked in for us on several fronts," said CEO Jen-Hsun Huang in a statement. "High-performance computing, virtualisation and web service providers have created demand for our GPU-accelerated data-center platforms.

"Automakers are using Tegra to help reinvent the driving experience. And our new Maxwell architecture is a giant leap forward that has triggered a major upgrade cycle by PC gamers."

So far this year, Nvidia has returned $950 million to its shareholders through cash dividends and share repurchasing.


Latest comments (13)

Robert Mac-Donald, Game Designer, Lethe Games (3 years ago)
Nvidia's business is still largely based on GPUs for laptops and desktop PCs
So what happens next year when Euclideon releases their two demo games based on their Unlimited Detail Engine, which can run games at around 30 FPS while rendering unlimited polygons, WITHOUT using a video card? Are video cards nearing their end?

Some sources if you happen to be clueless about the technology:

YouTube video showing their latest advancements with the technology
Euclideon Interview 2014
Jordi Rovira i Bonet, Lead Engineer, Anticto (3 years ago)
@Robert Nothing will happen. Until they prove they can move something in their scenes. If they do.

Robert Mac-Donald, Game Designer, Lethe Games (3 years ago)
It's difficult to doubt their claims now.

They already have a product called Geoverse being sold to governments, oil companies and more at £20,000 a license. How would they benefit by lying to the games industry about a claim they will never deliver? It only hurts their reputation when they are already selling a product.

Anyway,
Until they prove they can move something in their scenes.
They already proved that years ago, in a clip from a much older version of the engine. Here's the link at 5:20.
What we don't know is the level of quality their animations have right now, or whether they have improved at all. Euclideon said that in their next video they will show the current state of animations, so we'll have to wait a few more months for it.

Klaus Preisinger, Freelance Writing (3 years ago)
If Euclideon can prove that they can release a product which runs better on a CPU than on current GPUs, then nVidia will try and prove that the same Euclideon software runs better on their specialized chip than on a generalist CPU. That nVidia chip may need to be designed from the ground up, but such is their business. nVidia designs chips for special case scenarios which outperform pure CPUs at such a ridiculous rate that they cannot be ignored.
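
As a rough aside on the dynamic Klaus describes, the sketch below uses NumPy's bulk array operations as a stand-in for a throughput-oriented accelerator and a plain Python loop as a stand-in for naive single-threaded code; it illustrates data parallelism in general, not Nvidia's or Euclideon's actual software.

```python
# Toy illustration: the same per-element operation applied to millions of
# independent values is far faster when executed in bulk than one at a time.
# NumPy's vectorised path stands in for specialised, parallel hardware.
import time
import numpy as np

pixels = np.random.rand(2_000_000).astype(np.float32)   # two million "pixels"

t0 = time.perf_counter()
shaded_loop = [p * 0.5 + 0.1 for p in pixels]            # one element at a time
t1 = time.perf_counter()
shaded_bulk = pixels * 0.5 + 0.1                         # whole array in bulk
t2 = time.perf_counter()

print(f"per-element loop:  {t1 - t0:.3f}s")
print(f"bulk (vectorised): {t2 - t1:.3f}s")
```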
Robert Mac-Donald, Game Designer, Lethe Games (3 years ago)
if Euclideon can prove that they can release a product which runs better on a CPU than on current GPUs, then nVidia will try and prove that the same Euclideon software runs better on their specialized chip than on a generalist CPU
Bruce himself said in the past that they would tap into the power of video cards, and that video cards would probably be used to keep pushing resolutions further and further. That was about three years ago, however. In one of his most recent interviews (link) he says jumping to the GPU would be admitting defeat:
Dell said the current implementation produces 2000x1000 frames at around 32 FPS on a six-core processor. He claims there's "no reason" the technology can't be sped up using, for instance, OpenCL on a GPU, but there are "lots of software ways" to improve performance yet. Jumping straight to GPU optimization would be "admitting defeat," in his view.
If everything turns out OK, it is possible video cards will become a luxury in the future, rather than the bare essential for gaming. Personally I'd like that. Video cards are expensive, hard to get in my country (places like Newegg won't ship here), and I'm posting from a laptop using an external drive with Linux, because when I plug the drive into my desktop PC the Nvidia drivers for Linux crash the system. I am all for a future where computers get less expensive and less complex, with hardware-software integration.
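
As a rough aside on the numbers in that quote, here is the arithmetic it implies; the figures are approximations derived from the quote itself, not an Euclideon benchmark.

```python
# Pixel throughput implied by "2000x1000 frames at around 32 FPS on a six-core processor".
width, height = 2000, 1000
fps = 32
cores = 6

pixels_per_second = width * height * fps
per_core = pixels_per_second / cores

print(f"{pixels_per_second / 1e6:.0f} million pixels per second in total")  # ~64M
print(f"{per_core / 1e6:.1f} million pixels per second per core")           # ~10.7M
```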
Adam Campbell, Game Production Manager, Azoomee (3 years ago)
he says jumping to the GPU would be admitting defeat
Robert, think about it for a second, that doesn't make any sense.

GPUs are relatively cheap, highly parallel processing architectures that even come as part of the very CPU packages companies like Intel and AMD sell nowadays. Ignoring that processing capability, regardless of whether or not 'unlimited detail' on a CPU is possible, is silly.

As Klaus said, if the rendering approach changes, the GPU as one of the types of processor out there simply becomes another way of processing that stuff.

I think this should be about innovation and advancement, not about pointless crusades. Even with this technology, there are always going to be things a GPU can do better, even traditional CPU functions like physics and animation if you want, and it should be taken advantage of as such.
If everything turns out ok, it is possible video cards will become a luxury in the future, rather than the bare essential for gaming.
It may fill you with more hope knowing that many APUs, i.e. the Intel Core series and AMD A series, are becoming better and better at games performance, and they come as standard on the chip. Again, even with Euclideon's technology, such power, which is beyond the CPU on its own, should be taken advantage of, as it is effectively what the CPU has become.

Robert Mac-Donald, Game Designer, Lethe Games (3 years ago)
Even with this technology, there are always going to be things a GPU can do better
But the question is, will that be a necessity? If you can run games with good graphics, and most importantly solid FPS, on your non-gaming average laptop, maybe a lot of people won't bother paying more for the luxury of a video card, which comes with the issues of another fan making noise, a big piece of equipment that adds to the possibilities of malfunction and overheating, and that to my knowledge would make your monthly electricity bill increase a little as well. It also forces you to buy a more expensive power supply to handle it.

In an industry where consoles are big because people don't want to bother with having to build their own PCs, the tendency is for people to only bother with video cards when absolutely required, or when it does make a huge, remarkable difference in how your games run. I'd wager that even most PC gamers just want something they can plug and play. There are also the driver issues that come with a video card. I've had a laptop that could not run any MMOs without crashing (yes, oddly enough the issue only happened in MMOs), and for over two years the company responsible for its drivers did not fix it. There were threads with 15+ pages on multiple forums discussing the problem, but it never got fixed to my knowledge.

As for them being cheap, that's certainly not the case in third-world countries. A GTX 580 can cost the equivalent of 1,000 USD. Taking that out of your monthly income is a lot for most people.

Adam Campbell, Game Production Manager, Azoomee (3 years ago)
This is why I specifically made a point to direct a lot of attention to APUs.

We're looking at the Intel Core series with HD/Iris graphics, the AMD A series, etc. Nvidia is pushing resources into the low-cost, efficient market too. Many of these architectures are finding their way into consoles and tablets, as well as laptops and low-cost desktops.

Let me make it clear, these are graphics processors that come ON BOARD the CPU. You don't need a big, heavy card these days even to play modern games on modest settings. A cheap laptop now contains a better version of the very architecture inside the Xbox One and PS4.

Even some of the 'non-gaming' laptops, which all have graphics solutions mind you, will be better than no GPU at all. I'm heavily behind these solutions and also finding ways in games technology to get incredible graphical presentation on lower cost hardware.

All I'm saying is that Bruce Dell negating the GPU to prove a point, if he indeed said what you quoted above, doesn't actually help. It isn't 'admitting defeat' to say a GPU can texture an object better or calculate lighting more dynamically in real time, for example.
Klaus Preisinger, Freelance Writing (3 years ago)
@Robert

Let's take a gaming PC. You put a 250€ i7 in there and a 250€ graphics card alongside it. You need some additional infrastructure parts, such as a power supply, a mainboard, memory, whatever. The cost of the infrastructure is more or less fixed, even though you have lots of choices and price points. But when the rubber hits the road, your performance will come from those 500€ you spent on the CPU and the graphics card, not from whether you spend 200€ or 800€ on the rest of the equipment.

You could spend 500€ on the CPU and nothing on a graphics card right now, but once you start Ryse in 4K mode, not a lot will happen. Splitting the 500€ is the smart thing: it results in more power for that one use of your machine which actually takes the overall processing power to the limit. That is why the power of several different chips is combined in such a way that it benefits you the most.

Now Euclideon comes along and says "our new engine is only using CPU". Great, I want that engine to run as well as it can, so this time I spend 500€ on the CPU. Graphics card no longer required. This move did not make my computer one Euro cheaper. Sure, I could go with just the 250€ CPU, but there would be a dent in my performance, be it fewer frames, less resolution, whatever. Spend less money, get less power; no way around that, Euclideon engine or not.

Even in a Euclideon world, nVidia's job is to make the argument they make now. They see you really like Euclideon engine games. They know you are in the habit of spending 500€. All nVidia needs to say is, "Listen, you could buy that 500€ CPU for more frames, but tell you what: you stick with the 250€ CPU and give us the rest. In exchange, we give you a thing to plug into your computer. In combination, your 250€ CPU and our thing will still cost you 500€, but you get better picture quality and more frames than if you just spent 500€ on the CPU."
So you buy the nVidia thing after all. By the way, it is not a graphics card, it is the nVidia Euclideon accelerator Mantis: a bioengineered living bug inside your computer farting out processing power, literally. GPUs are so 2012.

The very nature of nVidia wanting to compete for part of your budget means their proposal needs to work whether you are planning to buy a 500€ CPU or a 100€ CPU. They will always try to make a product which entices you to split off part of your budget and give it to them, because the game you play looks better that way. Why? Because you do not need an i7 for office work and browsing porn sites. You do not even need an i3 for that. You can run a four-person office on one i3 and four Raspberry Pi VNC clients these days. Outside of gaming, processing power is over and done with; only latency remains as the final barrier.
Robert Mac-Donald, Game Designer, Lethe Games (3 years ago)
Those are all valid points to me as well Adam and Klaus.

When Euclideon releases the demo games, if we see that they truly have the potential to be implemented in every 3D game, personally I will try to migrate to the smallest, least driver-dependent hardware option available to me. I prefer convenience over graphics in the end, as my unlucky streak with purchased PCs and laptops is legendary.

On that same website I linked above (gaming face), Bruce says that:
Bruce Dell: Some very big companies have licensed the right to use our technology but we are under NDA so only they can make the announcement, apart from that we have our geoverse product that a lot of railroads, governments and construction companies now use to manage multi-terabyte laser scans.
I wonder if by any chance they are gaming companies. Only six more months until their demo games come out, and hopefully their next video showing animations arrives much sooner; then we'll be able to judge whether this thing will truly revolutionize the video game industry. If it does, combined with VR headset technologies, we'll be living in uncanny times.
Marty Howe, Director, Figurehead Studios (3 years ago)
who gives a f*** about GPU's and CPU's

does anyone care about making fun games anymore?
Adam Campbell, Game Production Manager, Azoomee (3 years ago)
who gives a f*** about GPU's and CPU's
Is the title not about the revenues of a semiconductor company?

Istvan Fabian, Principal Engineer, Sony Computer Entertainment Europe (3 years ago)
As for Euclideon:
OK, so it uses six cores for rendering the voxels at an acceptable frame rate, at a medium resolution, with no animation or moving objects whatsoever, and no change of lighting apart from the lights that were present during scanning, which is identical to having baked lights... on a high-end CPU.
So it is essentially playing back a true 3D movie where voxel colors are interpolated in 3D, which is a nice thing to do, but has nothing much to do with what games need - apart from non-interactive cutscenes.
Once you have physics, AI, objects, characters and animation, which all compete for the already maxed-out CPU power, you do have a big problem.
It will be nice with 16/32-core CPUs, which may happen mid-term, and a huge amount of RAM for animation.
Where the hardware (by Nvidia or anyone else) would be way better than what they have in software is the actual sorting of the pixels, and interpolation coupled with low-latency RAM. Dedicated hardware is still a winner; just the purpose would be slightly different.

Effectively, what they are doing is waiting for consumer technology to catch up (and hoping they don't go bust until then) with the 3D movie tech they have - which was actually used in games before, just at a lot worse resolution.
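
As an editorial aside on the "voxel colours interpolated in 3D" point above, here is a minimal, generic sketch of trilinear interpolation over a regular voxel grid; it illustrates the technique in general, not Euclideon's (or anyone's) actual renderer.

```python
# Trilinear interpolation: blend the 8 voxels surrounding a fractional 3D coordinate.
import numpy as np

def trilinear_sample(grid: np.ndarray, x: float, y: float, z: float) -> float:
    """Sample a 3D grid at a fractional coordinate (must lie within the grid interior)."""
    x0, y0, z0 = int(np.floor(x)), int(np.floor(y)), int(np.floor(z))
    x1, y1, z1 = x0 + 1, y0 + 1, z0 + 1
    fx, fy, fz = x - x0, y - y0, z - z0

    # Blend along x, then y, then z.
    c00 = grid[x0, y0, z0] * (1 - fx) + grid[x1, y0, z0] * fx
    c10 = grid[x0, y1, z0] * (1 - fx) + grid[x1, y1, z0] * fx
    c01 = grid[x0, y0, z1] * (1 - fx) + grid[x1, y0, z1] * fx
    c11 = grid[x0, y1, z1] * (1 - fx) + grid[x1, y1, z1] * fx
    c0 = c00 * (1 - fy) + c10 * fy
    c1 = c01 * (1 - fy) + c11 * fy
    return c0 * (1 - fz) + c1 * fz

voxels = np.random.rand(8, 8, 8)            # a tiny pre-baked colour/density grid
print(trilinear_sample(voxels, 3.4, 2.7, 5.1))
```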
