Created Sunday, March 2, 2025
In late 2023 Nintendo posted a tweet that we had anticipated for years:
As of early April 2024, online play and other functionality that uses online communication will end service for Nintendo 3DS and Wii U software. Thank you very much for your continued support of our products.
— Nintendo of America (@NintendoAmerica) October 4, 2023
Find out more: https://t.co/nOyzBImHCE
This revelation set a lot of things in motion. Early on, some notable archival projects were started with plenty of time to prepare. Some of these were run by developers with experience in particular games using NintendoClients, like Super Mario Maker levels. Some others requested the community submit StreetPass data from modded consoles. Most projects were involved with the Pretendo group, who are the most knowledgeable group about Wii U online reverse engineering.
I was tipped off to the impending shutdown of online service for every Wii U and 3DS game, including user created data, and that reminded me of my prior projects to archive Mario Maker 2 data and parse and display it. I realized there was a likely opportunity to use my skills and there was a good chance a lot of games were not popular enough to be receiving enough attention to be archived. So around 2 months before the shutdown I began talks with the Pretendo team to archive every game, indiscriminately and exhaustively.
This blogpost is about my effort to save every Wii U and 3DS game from the brink.
If you’ve been following my projects, you would know I usually use on-console homebrew to conduct exploratory research, then switch my efforts to automating via NintendoClients. This project was uniquely different:
I don’t own a Wii U?! I do not own any Wii U games?!
Unlike the Nintendo Switch, there are no application tokens necessary to connect to each game's online services individually. You can get full access to any game's online services with a free Nintendo Network account. Once you have Nintendo Network credentials and specify some additional data (title ID, title version, NEX protocol version and a hardcoded 8 character hex string called the access key) you can connect to the services exposed by the game. I obtained some credentials and was off to the races, no games required.
But I did need a list of games! Kinnay, who designed NintendoClients, also brute forced a list of all games on the WiiU by incrementing the title ID by 0x100 and requesting game metadata from the eshop servers. Importantly for me, he generated a list of all games with online support, including IP addresses and ports, but it wasn't that simple. Initially that list was quite short, including only the games confirmed with a MITM proxy, and after talking with Pretendo I suspected there were missing games.
```python
TITLE_ID = "0005..."
TITLE_VERSION = 8377
NEX_VERSION = [3, 10, 0]  # 3.10.0
ACCESS_KEY = "..."

nas = nnas.NNASClient()
nas.set_device(DEVICE_ID, SERIAL_NUMBER, SYSTEM_VERSION)
nas.set_title(int(TITLE_ID, 16), TITLE_VERSION)
nas.set_locale(REGION_ID, COUNTRY_NAME, LANGUAGE)

access_token = await nas.login(USERNAME, PASSWORD)

# Generally the last 8 hex digits of the Japanese version of the game
nex_token = await nas.get_nex_token(access_token.token, int(TITLE_ID[8:], 16))

nex_version = (
    game["nex"][0] * 10000
    + game["nex"][1] * 100
    + game["nex"][2]
)

s = settings.default()
s.configure(ACCESS_KEY, nex_version)

async with backend.connect(s, nex_token.host, nex_token.port) as be:
    async with be.login(str(nex_token.pid), nex_token.password, None) as client:
        # Websocket connection started, onward!
        store = datastore.DataStoreClient(client)
```
As it happens, there were. Two key pieces of data were not present in Kinnay's `wiiu.json`. The first was the title version, which I was quickly able to fill in with the IDBE archive collected by Pretendo. The second was significantly harder and led me into uncharted territory on the WiiU…
Unlike title versions, access keys are not obtainable from any eshop endpoint. These strings are generally identified by reverse engineering the binaries of each game and looking for a specific known pattern. That pattern was made pretty clear by the keys published by Kinnay:
```
7b9b09cb
24e0a63b
59d539a9
7fcf384a
...
```
The access key is just 8 lowercase hex digits, i.e. a 32 bit number. So rather than bulk obtaining game binaries, the approach that made sense to me was brute forcing.
Brute forcing over the network would have been prohibitively time consuming (\(2^{32}\) seconds is 136 years!), but I could adopt a strategy used by advanced password recovery tools like Hashcat. Instead of trying random strings against some endpoint, those tools apply some operation (usually concatenation with a salt and then hashing) and attempt to match the result against a known hash, allowing for local testing that can be massively parallelized. The operation I chose was similar: upon connection, send a SYN packet and store the encrypted SYN packet sent back:
```python
# Firstly, obtain one SYN packet
syn_packet = SynPacket()
syn_packet_lock = threading.Lock()
syn_packet_lock.acquire()

# WiiU is UDP
async with udp.connect(nex_token.host, nex_token.port) as socket:
    async with util.create_task_group() as group:
        transport = prudp.PRUDPClientTransport(s, socket, group)

        async def process_incoming():
            while True:
                data = await transport.socket.recv()

                with util.catch(Exception):
                    packets = transport.packet_encoder.decode(data)
                    for packet in packets:
                        if packet.type == prudp.TYPE_SYN:
                            syn_packet.packet = packet
                            syn_packet.syn_packet_options = (
                                transport.packet_encoder.encode_options(packet)
                            )
                            syn_packet.syn_packet_header = (
                                transport.packet_encoder.encode_header(
                                    packet,
                                    len(syn_packet.syn_packet_options),
                                )
                            )
                            syn_packet.syn_packet_payload = packet.payload
                            syn_packet.syn_packet_signature = packet.signature
                        else:
                            await transport.process_packet(packet)

        transport.group.start_soon(process_incoming)

        client = prudp.PRUDPClient(s, transport, s["prudp.version"])
        with transport.ports.bind(client, type=10) as local_port:
            client.bind(socket.local_address(), local_port, 10)
            client.connect(socket.remote_address(), 1, 10)

            async with client:
                client.scheduler = scheduler.Scheduler(group)
                client.scheduler.start()

                client.resend_timeout = 0.05
                client.resend_limit = 0

                try:
                    await client.send_syn()
                    await client.handshake_event.wait()

                    if client.state == prudp.STATE_CONNECTED:
                        pass

                    syn_packet_lock.release()
                except RuntimeError:
                    pass

syn_packet_lock.acquire()
syn_packet_lock.release()
```
Then apply the same operation used internally to both decrypt the packet and determine its authenticity:
```python
def test_access_key(access_key, syn_packet):
    key = hashlib.md5(access_key.encode()).digest()
    mac = hmac.new(key, digestmod=hashlib.md5)
    mac.update(syn_packet.syn_packet_header[4:])
    mac.update(b"")
    mac.update(struct.pack("<I", sum(access_key.encode())))
    mac.update(b"")
    mac.update(syn_packet.syn_packet_options)
    mac.update(syn_packet.syn_packet_payload)

    return mac.digest() == syn_packet.syn_packet_signature
```
Note the similarity to the procedure used in NintendoClients for encoding a packet:
```python
class PRUDPMessageV1:
    def __init__(self, settings):
        self.access_key = settings["prudp.access_key"].encode()

    def signature_size(self):
        return 16

    def calc_packet_signature(self, packet, session_key, connection_signature):
        options = self.encode_options(packet)
        header = self.encode_header(packet, len(options))

        key = hashlib.md5(self.access_key).digest()
        mac = hmac.new(key, digestmod=hashlib.md5)
        mac.update(header[4:])
        mac.update(session_key)
        mac.update(struct.pack("<I", sum(self.access_key)))
        mac.update(connection_signature)
        mac.update(options)
        mac.update(packet.payload)
        return mac.digest()
```
On average I expected to check \(2^{31}\) access keys before finding the correct one, and that alone was not a big enough improvement: running in a single process, each game would have taken more than 1.5 hours to check. With 746 games advertising themselves as NEX compatible, the whole process would have taken more than a month, which was unacceptable. So, in order to finish on time, I ran the check in 16 processes simultaneously on my Macbook M1, the machine I own with the best CPU. I split the \(2^{32}\) keyspace evenly between the processes so they wouldn't waste time syncing with each other.
```python
from multiprocessing import Array, Lock, Process, Value

def range_test_access_key(
    i, syn_packet, host, port, title_id, found_key, done_flag
):
    interval = int(pow(2, 32) / NUM_PROCESSES)

    for number_key_base in range(interval):
        number_key = number_key_base + i * interval

        if number_key_base % 1000000 == 0:
            # Check occasionally if it's time to terminate
            if done_flag.value:
                return

        string_key = hex(number_key)[2:].rjust(8, "0")
        if test_access_key(string_key, syn_packet):
            entry = "%s, %s, %s, %s, (%d)" % (
                hex(title_id)[2:].upper().rjust(16, "0"),
                hex(title_id)[-8:].upper(),
                string_key,
                host,
                port,
            )

            list_file = open(LIST_PATH, "a")
            list_file.write("%s\n" % entry)
            list_file.flush()
            list_file.close()

            print(entry)

            found_key.value = ("%s" % string_key).encode()

            done_flag.value = True
            break

found_key_lock = Lock()
found_key = Array("c", 10, lock=found_key_lock)
done_flag = Value("b", False)  # Shared flag set once any process finds the key

processes = [
    Process(
        target=range_test_access_key,
        args=(
            i,
            syn_packet,
            nex_token.host,
            nex_token.port,
            int(TITLE_ID, 16),
            found_key,
            done_flag,
        ),
    )
    for i in range(NUM_PROCESSES)
]

for p in processes:
    p.start()
for p in processes:
    p.join()

if found_key.value:
    possible_access_keys.add(found_key.value.decode("utf-8"))
```
With this code I managed to bring the time per key down to 8 minutes, easily quick enough when leaving my M1 on overnight (and bringing my laptop to college with the process still running!). This was one of the few times I managed to push my M1 to 100% CPU usage, a good sign that I was writing code that got the most out of my hardware.
After generating this list over 4 days, I was surprised to find that nearly all of the access keys I had cracked had 0 results on Google and Github. I was not just the first person to scrape these games; I was likely the first person ever to connect to them over NintendoClients! That reaffirmed my motivation to continue before these games were lost, now less than a month away.
According to Kinnay's wiki, NEX is based on Quazal Rendez-Vous, which exposes a number of "protocols", each with endpoints. While NEX exposes most base protocols from QRV (including matchmaking, which is a key part of online play), no base protocol, other than the unused Persistent Store (24), exposes the long term data storage used by games. So the juicy protocols worth scraping were generally custom ones made by Nintendo. The two protocols I settled on were the one used for leaderboards (Ranking, 112) and the one used for miscellaneous data not stored externally (DataStore, 115).
The major benefit of scraping NEX games is the compatibility they generally share in how their data is accessed, so I didn't have to divert time towards misbehaving games in the short month I had left. For example, the Ranking protocol is very simple to bulk scrape. My approach was to use the following endpoint for the first 1000 results:
```python
ranking_client = ranking.RankingClient(client)

order_param = ranking.RankingOrderParam()
# When players share scores, the one who got the score first goes first
order_param.order_calc = ORDINAL_RANKING
# Starts at 0, increases each iteration by OFFSET_INTERVAL
order_param.offset = cur_offset
order_param.count = OFFSET_INTERVAL

rankings = await ranking_client.get_ranking(
    ranking.RankingMode.GLOBAL,  # Get the global leaderboard
    category,
    order_param,
    0,
    0,
)
```
Then I switched to the following endpoint once the game stopped returning data, as quite a few games only displayed the first 1000 results in-game and only returned additional results around your personal rank:
```python
ranking_client = ranking.RankingClient(client)

order_param = ranking.RankingOrderParam()
order_param.order_calc = ORDINAL_RANKING
order_param.offset = 0
order_param.count = OFFSET_INTERVAL

rankings = await ranking_client.get_ranking(
    ranking.RankingMode.GLOBAL_AROUND_SELF,  # Get the leaderboard around this player
    category,
    order_param,
    last_id_seen,
    last_pid_seen,
)
```
Well, great, except for one thing: categories. Like the access keys, games do not report their categories. Unless specifically exposed through another process, like a DataStore endpoint, categories are hard-coded into the game binary. As with the access keys, I did not want to obtain game binaries, but unlike the access keys there is no way to locally brute force the complete list. I experimented with parallelizing network requests for categories, but the results were disappointing.
With that route closed, I had to settle on a heuristic. As seen above, some games are known to have bitwise categories (like Donkey Kong Country: Tropical Freeze), so any approach that only checked a small range of categories would likely miss some. I did not have an alternative I could implement quickly, so I just chose the range 0 to 1000. That approach worked fine for most games, but it was clear from the beginning that I was missing valuable data: Puyo Puyo Tetris, an egregious example, seemed to create new categories for every match, and Romance of the Three Kingdoms XII had a single category of 223, suspiciously far from 0.
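The 0-to-1000 sweep reduces to a simple probe loop. As a minimal sketch, `scan_categories` below is a hypothetical helper of mine, with the actual `get_ranking` call (and its error handling) injected as `check_category`:

```python
def scan_categories(check_category, limit=1000):
    """Probe categories 0..limit and keep the ones that return data.

    check_category(cat) -> bool is assumed to wrap a
    ranking_client.get_ranking(...) call, returning False when the
    server errors out (e.g. because the category does not exist).
    """
    found = []
    for cat in range(limit + 1):
        if check_category(cat):
            found.append(cat)
    return found
```

In practice each probe is a full network round trip, which is exactly why sweeping the whole 32-bit category space this way was off the table.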
Next, I tackled the DataStore protocol. This one had a lot more applicable endpoints, so I tested all of them on every game to see which were supported. From just a sample of games I could tell which endpoints were already out of the question.
| | DKC: TF | FotNS | Hyrule Warriors | Injustice: GAU | M&S 2014 | M&S 2016 | MK8 | Mighty N. 9 |
|---|---|---|---|---|---|---|---|---|
| get_metas | X | X | | | | | | |
| search_object | X | X | | | | | | |
| get_ratings | X | X | | | | | | |
| get_specific_meta_v1 | X | X | | | | | | |
| get_rating_with_log | X | X | | | | | | |
| get_persistence_infos | X | X | | | | | | |
| prepare_get_object_or_meta_binary | X | X | | | | | | |
| prepare_get_object | X | X | | | | | | |
| prepare_get_object_v1 | X | X | | | | | | |
| get_password_infos | X | X | | | | | | |
| get_metas_multiple_param | X | X | | | | | | |
| get_object_infos | X | X | X | X | X | X | X | |
| search_object_light | X | X | X | X | X | X | X | |
Some threw `Core::NotImplemented`, an error built into the protocol, and some threw runtime errors for most inputs, like (ordered according to how scared I was to see them) `PythonCore::ConversionError`, `OverflowError: Buffer overflow` and `DataStore::OperationNotAllowed`.
These results were interesting because games primarily use NEX as a data store; leaderboards are not nearly as common. And the games that did not support any DataStore endpoint usually didn't support any Ranking endpoints either. Jon, from Pretendo, and I had theories (like the idea that a game that only supported p2p online play still needed to register itself as a NEX compatible server), but otherwise we had no idea.
I was disappointed to see `get_object_infos` lack support almost across the board, because it is the only endpoint that supports bulk object requests by data ID, the primary way data is referenced in the data store. I decided on using `search_object` with an offset to get the first data ID, falling back on `search_object` with a timestamp before the start of the WiiU's release.
```python
store = datastore.DataStoreClient(client)

param = datastore.DataStoreSearchParam()
param.result_range.offset = 0
param.result_range.size = 1
param.result_option = 0xFF
res = await store.search_object(param)

last_data_id = None
if len(res.result) > 0:
    last_data_id = res.result[0].data_id
else:
    # Try timestamp method from 2012 as a backup
    param = datastore.DataStoreSearchParam()
    param.created_after = common.DateTime.fromtimestamp(1325401200)
    param.result_range.size = 1
    param.result_option = 0xFF
    res = await store.search_object(param)

    if len(res.result) > 0:
        last_data_id = res.result[0].data_id

if last_data_id is None or last_data_id > 900000:
    # Just start here anyway lol
    last_data_id = 900000
```
Then, with the first data ID obtained, I used the same assumption as in my Mario Maker 2 scraping: data IDs are sequential. I multiprocessed requests for both metadata and the data itself, again with 16 processes. I used `get_metas`, which accepts a list of data IDs, and `prepare_get_object`, which returns, per object, an HTTP URL and headers you can request the data from.
A major benefit of my approach so far was fault tolerance: my scripts were designed to handle being terminated, whether by me or by a bug, without losing progress. I was writing all this data into individual SQLite databases, one each for Ranking and DataStore. I'm glad I implemented this, because it became very necessary!
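The resumability mostly falls out of SQLite for free: because each saved row is keyed by data ID, a restarted scraper can pick up just past the highest ID already on disk. A rough sketch of the idea, where the `datastore_meta` table and its columns are illustrative rather than my actual schema:

```python
import sqlite3

def open_progress_db(path):
    """Open (or create) a scrape database keyed by data ID."""
    db = sqlite3.connect(path)
    db.execute(
        "CREATE TABLE IF NOT EXISTS datastore_meta"
        " (data_id INTEGER PRIMARY KEY, meta BLOB)"
    )
    return db

def resume_point(db, default_data_id):
    # On restart, continue just past the highest data ID already saved
    row = db.execute("SELECT MAX(data_id) FROM datastore_meta").fetchone()
    return row[0] + 1 if row[0] is not None else default_data_id
```

With `data_id INTEGER PRIMARY KEY` a crash mid-batch costs at most one batch of work, and re-inserting an already-saved row is easy to detect.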
Once my heuristics were chosen and my scraping approach was finalized I started scraping… and debugging… and scraping… and debugging.
Music Source: Aphex Twin - Vordhosbn
My chosen platform was once again my desktop. It was connected by ethernet to my router, but unlike with Mario Maker 2, where I had dorm internet, I was now at the whims of Xfinity: they frequently throttled my connection, and the maximum download speed was much lower than I had hoped. After all my optimizations, my ISP was the bottleneck.
I had my deadline firmly implanted in my brain, both at school and at home…
Update: as of 4/8, online play and other functionality that uses online communication will end service for Nintendo 3DS and Wii U software. Thank you very much for your continued support of our products.
— Nintendo of America (@NintendoAmerica) January 24, 2024
Find out more: https://t.co/VdIdewGmB5
By this point, it had been publicly stated I was scraping every game, so tensions and excitement were high…
(Embedded Reddit comment by u/cheater00 in r/DataHoarder.)
Progress, progress, progress until…
March 30th. 9 days before the shutdown.
The answer was no. It was time to expand the scope enormously. I had started with 63 WiiU games; I was now adding 143 3DS games to my queue. This new list required access key cracking, changes in authentication, changes in NEX version handling and a lot more debugging, all in a fraction of the time I had spent on a single game, Mario Maker 2, back in 2021.
It was time to bring in EC2.
Music Source: Cursedsnake - Resting
I knew EC2 would increase the cost of this project enormously, but my principles of data as a tool for social good made it worth it. I wanted to ensure, if it ever became important to someone that something from the 12 year lifespan of the WiiU or the 13 year lifespan of the 3DS was preserved, they could have access to it and do whatever research or project they wanted with it. After all, my work as a reverse engineer makes me extremely reliant on data. Data is the lifeblood of my creative fervor.
So it was time to finish, full speed ahead. I was going to continue my scraping-and-debugging cycle until someone at Nintendo blocked me in a way I couldn't bypass.
And that they did, when at 6:22 PM MST the NNAS authentication servers for the WiiU started returning errors for every request. Unlike a few dedicated players, who stayed on the Nintendo Network for so long because they never allowed their WiiU to attempt a log in to NNAS, my exact workflow required NNAS to work properly. I watched as my fellow soldiers equipped autoclickers and auto-play mods to evade the impending doom of a log in attempt. My story here was over.
After the dust had settled and it became perfectly clear there was no longer any way to connect to any Nintendo Network service, I analyzed all the SQLite files I had collected. This became much more difficult once I realized just how many individual SQLite files I had to merge, especially because some of those databases would conflict if naively combined. As expected, I would have to write still more custom Python.
Since I switched to a RAID 5 array my filesystem has become faster and more reliable than when I used a single 14 terabyte hard drive (as I did for the Mario Maker 2 scrape). Even so, I knew the work would take a while, and I wasn't in a hurry, as I envisioned my preliminary projects as being mostly artistic in nature. That's why this blogpost is coming out nearly a year after Nintendo disabled the NNAS authentication server on April 9th, 2024.
The 2 EC2 instances I ran did cost a lot… 120 bucks. For just one week that was a bit egregious, but I knew going in there would be damage. For comparison, the Mario Maker 2 API costs me approximately $120 a year. I am no stranger to spending money to realize my ideal projects.
The approximate stats for Ranking and DataStore (number of entries) scraping are as follows:
Title ID | Count |
---|---|
00040000000AA700 | 5060 |
00040000000C9100 | 11445 |
00040000000D1000 | 5798 |
00040000000D4A00 | 102996 |
00040000000D6C00 | 52 |
00040000000DBD00 | 5440 |
00040000000E8100 | 34251 |
00040000000E9000 | 11798 |
00040000000ED500 | 58 |
00040000000EDA00 | 606 |
00040000000EF000 | 850100 |
00040000000F2200 | 840640 |
00040000000F4F00 | 31297 |
00040000000F5100 | 91108 |
00040000000F6B00 | 10104 |
00040000000FB000 | 1181 |
0004000000103F00 | 25462 |
0004000000108100 | 4710 |
000400000010BD00 | 2371 |
0004000000111E00 | 137695 |
0004000000112300 | 27 |
0004000000112500 | 128323 |
0004000000112600 | 35274 |
0004000000118500 | 109 |
0004000000125400 | 2421157 |
000400000012E900 | 3 |
000400000012F800 | 4957 |
0004000000134500 | 244 |
0004000000138900 | 69590 |
000400000013BB00 | 28260 |
000400000013C300 | 8110 |
0004000000140B00 | 6956 |
0004000000141E00 | 785 |
0004000000144400 | 157305 |
000400000014A400 | 292538 |
000400000014A900 | 12 |
000400000014BA00 | 15519 |
000400000014DF00 | 2263 |
0004000000150100 | 59951 |
0004000000154300 | 371 |
0004000000156900 | 5480 |
0004000000158E00 | 1373 |
0004000000159000 | 8591 |
000400000015A200 | 158067 |
000400000015DC00 | 14326 |
0004000000162300 | 237 |
0004000000162B00 | 4031 |
0004000000163C00 | 3208 |
0004000000169E00 | 190026 |
000400000016AD00 | 202656 |
0004000000183500 | 958 |
0004000000185F00 | 14 |
0004000000196D00 | 256589 |
0004000000197500 | 71555 |
0004000000198700 | 338866 |
0004000000198B00 | 5344 |
0004000000199900 | 21 |
000400000019A400 | 115 |
000400000019A800 | 10220 |
00040000001A2B00 | 253590 |
00040000001A4500 | 3054803 |
00040000001A7300 | 274 |
00040000001ACB00 | 63554 |
00040000001B3F00 | 44791 |
0005000010100600 | 964040 |
0005000010106900 | 2163988 |
000500001010E400 | 1011 |
000500001010EB00 | 10948309 |
000500001010FB00 | 38471 |
0005000010110100 | 6126 |
0005000010110900 | 352031 |
0005000010111700 | 133809 |
0005000010111C00 | 747 |
0005000010111F00 | 549722 |
0005000010113800 | 1352 |
0005000010113A00 | 635316 |
0005000010113B00 | 47741 |
0005000010116100 | 408930 |
000500001011AF00 | 2068610 |
000500001011C300 | 90175 |
0005000010129000 | 738878 |
000500001012B100 | 272986 |
000500001012BC00 | 9971182 |
000500001012C000 | 105242 |
000500001012F000 | 2921 |
0005000010131900 | 7583 |
0005000010132A00 | 325 |
0005000010135000 | 16 |
0005000010138A00 | 417377 |
0005000010144800 | 401980 |
0005000010145E00 | 221542 |
0005000010147400 | 797 |
0005000010149700 | 558319 |
000500001014D900 | 223575 |
0005000010150300 | 30562 |
0005000010173300 | 369993 |
000500001017AC00 | 2225 |
0005000010190300 | 2738821 |
0005000010193300 | 561 |
0005000010198F00 | 838151 |
000500001019A600 | 77343 |
00050000101A1400 | 6980 |
00050000101A5E00 | 92365 |
00050000101C0A00 | 15328 |
00050000101C5800 | 4238212 |
00050000101C5A00 | 29271 |
Type | Title ID | Count |
---|---|---|
METADATA | 0005000010106900 | 6729511 |
METADATA | 0005000010111700 | 4912843 |
METADATA | 0005000010144800 | 13952 |
METADATA | 000500001017CD00 | 212 |
METADATA | 0005000010100600 | 520467 |
METADATA | 000500001010EA00 | 13340 |
METADATA | 000500001010EB00 | 17741221 |
METADATA | 0005000010110900 | 58026 |
METADATA | 0005000010110E00 | 24093 |
METADATA | 0005000010111C00 | 15793 |
METADATA | 0005000010116100 | 895671 |
METADATA | 000500001011A800 | 840 |
METADATA | 0005000010149700 | 3993679 |
METADATA | 000500001014D900 | 827923 |
METADATA | 0005000010190300 | 3728394 |
METADATA | 00050000101BEB00 | 128191 |
METADATA | 0004000E00086400 | 232814 |
METADATA | 000400000011C400 | 8212012 |
METADATA | 0004000000162300 | 8908 |
METADATA | 0004000000178800 | 5274742 |
METADATA | 000400000017D500 | 224 |
METADATA | 00040000001B4A00 | 457047 |
METADATA | 00040000000EF000 | 29808 |
METADATA | 000400000012D800 | 221096 |
METADATA | 0004000000055D01 | 8063476 |
METADATA | 00040000000C9D00 | 1679 |
METADATA | 00040000000D4500 | 1927 |
METADATA | 0004000000124A00 | 862865 |
METADATA | 000400000012DC00 | 593736 |
METADATA | 0004000000131F00 | 116602 |
METADATA | 0004000000134600 | 16197440 |
METADATA | 0004000000149B00 | 2228 |
METADATA | 0004000000155000 | 5646 |
METADATA | 000400000015CD00 | 1432570 |
METADATA | 0004000000164800 | 618434 |
METADATA | 0004000000166B00 | 181636 |
METADATA | 0004000000169E00 | 180800 |
METADATA | 000400000016AD00 | 117 |
METADATA | 0004000000197500 | 89172 |
METADATA | 000400000019F600 | 539369 |
METADATA | 00040000001A0000 | 28810 |
METADATA | 00040000001ACB00 | 159 |
METADATA | 00040000001B1900 | 16580 |
METADATA | 00040000001B3F00 | 84172 |
METADATA | 00040000001B4300 | 3576 |
METADATA | 00040002000B8B01 | 18093 |
METADATA | 0005000010128C00 | 978 |
DATA | 0005000010106900 | 2225634 |
DATA | 0005000010144800 | 13863 |
DATA | 0005000010100600 | 504375 |
DATA | 000500001010EA00 | 12992 |
DATA | 000500001010EB00 | 735915 |
DATA | 0005000010110900 | 58023 |
DATA | 0005000010110E00 | 19789 |
DATA | 0005000010111C00 | 15792 |
DATA | 0005000010149700 | 977249 |
DATA | 0005000010190300 | 1776948 |
DATA | 0004000E00086400 | 232819 |
DATA | 000400000011C400 | 663073 |
DATA | 0004000000162300 | 8908 |
DATA | 00040000001B4A00 | 112433 |
DATA | 000400000012D800 | 152984 |
DATA | 000400000012DC00 | 322155 |
DATA | 0004000000134600 | 836886 |
DATA | 0004000000155000 | 5650 |
DATA | 000400000015CD00 | 673325 |
DATA | 0004000000164800 | 384792 |
DATA | 000400000016AD00 | 117 |
DATA | 000400000019F600 | 129319 |
DATA | 00040000001ACB00 | 159 |
DATA | 00040002000B8B01 | 15940 |
DATA | 000500001014D900 | 585 |
DATA | 00050000101BEB00 | 8936 |
DATA | 0004000000169E00 | 90 |
The project linked on this blogpost is the archived leaderboards of WiiU games, like Mario Kart 8 and Mario & Sonic 2016, displayed. I based the style on the player leaderboards in Mario Maker 2. You can also select users to see all of their scores recorded in other WiiU games. Finally, I added a SRC (Speedrun.com) style WR timeline to give a better picture of each leaderboard, as long as the game reported an `update_time` when calling `get_ranking`. Please note one limitation: since a player's record is replaced when they beat it, the only scores shown are each player's PB, making the timeline less accurate.
If you happen to remember a time or score you got, you can find yourself in the archive; or if you know your PID, you can put it in the URL.
As the "scores" returned by `get_ranking` are just integers, I generally inferred the datatype and formatted it for easier viewing. Additionally, as the categories have no metadata linking them to specific levels or leaderboards in game, I tried my best to research each game to determine the likely order of categories. This research consisted of searching YouTube for old ranked gameplay, finding the SRC leaderboards and looking at WRs, and using a cool site called cyberscore.me.uk that had gone under my radar. My appreciation goes to them for enabling me to label quite a few games correctly.
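As a concrete example of that inference: a racing game's "score" is usually a time stored in milliseconds, so the formatter just needs a per-category guess at the datatype. `format_score` below is a hypothetical, simplified version of what the site does, not the actual implementation:

```python
def format_score(raw, kind):
    """Render a raw integer leaderboard score for display.

    kind is a per-category guess: "time_ms" assumes the game stored
    a time in milliseconds; anything else is shown as a plain count.
    """
    if kind == "time_ms":
        minutes, rem = divmod(raw, 60_000)
        seconds, millis = divmod(rem, 1_000)
        return f"{minutes}:{seconds:02d}.{millis:03d}"
    return f"{raw:,}"
```

So a raw value of 83456 in a time category renders as 1:23.456, while the same integer in a points category stays a comma-separated count.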
For an example of a good leaderboard check out Stealth Inc 2’s 1-1 Forever a Clone.
I will be working on interpreting the DataStore archive for two big reasons: I really want to make level viewers and I really want to play back replays (especially from ACNL and MK8, respectively). Look out for a future post about those!
Remember that the world is out there for the taking. With the help of just a few smart people at Pretendo I did something that had never been done before, and never will be done again. With enough motivation, and a bit of time, you can do anything you put your mind to.