Hacker publishes alleged Wii U CPU and GPU specs
29th Nov 2012 | 13:13
A well-known hardware hacker has published what is believed to be the processor and graphics card specs of Nintendo's Wii U console.
The Wii U processor is said to carry a clock speed of 1.24 GHz - less than half the speed of the CPUs in the PS3 and Xbox 360. However, its GPU core is believed to run at 550 MHz, which is the same speed as the GPU in Sony's home console and a tad faster than Microsoft's.
The Wii U processor speed has previously been described by one developer as "slow and horrible", though Nintendo has never revealed the final system specs so it remains a matter for debate.
A hacker by the name of Marcan, however, may have brought an end to the speculation by publishing specs on his personal Twitter account. The hardware enthusiast is a well-known Wii hacker who publishes his work on HackMii.com.
CVG is contacting various developers to confirm Marcan's claims.
Marcan has also suggested he is attempting to hack the new Nintendo console.
Last Tuesday, the chief technical officer at Metro: Last Light developer 4A said there would be difficulties with a Wii U port due to the console's "horrible, slow CPU".
His statement was subsequently put into a more philosophical perspective by publishing partner THQ.
However, another developer has echoed the comments made by the chief technical officer at 4A Games.
Gustav Halling, lead designer on Battlefield 3: Armored Kill at EA studio DICE, said on Twitter that he was concerned with how the next Xbox and PlayStation will likely dwarf the Wii U hardware specs.
"This is also what I been hearing within the industry, too bad since it will shorten its life a lot when new gen starts," he said.
He continued: "GPU and RAM is nice to have shaders/textures loaded. Physics and gameplay run on CPU mostly so player count is affected etc.
"I don't actually know what makes it slow, but enough 'tech' people I trust in world are saying the same things."
He went on to claim that the Wii U "should be a great fun platform if you are a Nintendo fan the coming years and the memory and GPU part looks good!"