Wii U & 3DS Mass Scraping + Leaderboards

Created Sunday, March 2, 2025

In late 2023 Nintendo posted a tweet that we had anticipated for years: online services for the Wii U and 3DS would be shut down in early April 2024.

This revelation set a lot of things in motion. Early on, some notable archival projects were started with plenty of time to prepare. Some were run by developers with NintendoClients experience in particular games, like Super Mario Maker levels; others asked the community to submit StreetPass data from modded consoles. Most projects involved the Pretendo group, the people most knowledgeable about Wii U online reverse engineering.

I was tipped off to the impending shutdown of online services for every Wii U and 3DS game, including user-created data, and that reminded me of my prior projects to archive, parse and display Mario Maker 2 data. I realized my skills were likely applicable, and that there was a good chance many games were not popular enough to receive any archival attention at all. So, around 2 months before the shutdown, I began talks with the Pretendo team about archiving every game, indiscriminately and exhaustively.

This blogpost is about my effort to save every Wii U and 3DS game from the brink.

Is This Homebrew?

If you’ve been following my projects, you’ll know I usually use on-console homebrew for exploratory research, then switch my efforts to automating via NintendoClients. This project was uniquely different:

I don’t own a Wii U?! I do not own any Wii U games?!

Unlike the Nintendo Switch, there are no application tokens necessary to connect to each game's online services individually. You can get full access to any game's online services with a free Nintendo Network account. Once you have Nintendo Network credentials and some additional data (title ID, title version, NEX protocol version and a hardcoded 8 character hex string called the access key), you can connect to the services exposed by the game. I obtained some credentials and was off to the races, no games required.

But I did need a list of games! Kinnay, who wrote NintendoClients, also brute forced a list of all games on the WiiU by incrementing the title ID by 0x100 and requesting game metadata from the eshop servers. Importantly for me, he generated a list of all games with online support, including IP addresses and ports, but it wasn't that simple. Initially that list was quite short, covering only the games confirmed with a MITM proxy, and after talking with Pretendo I suspected there were missing games.

A Basic WiiU Connection

from nintendo import nnas
from nintendo.nex import backend, settings, datastore

TITLE_ID = "0005..."
TITLE_VERSION = 8377
NEX_VERSION = [3, 10, 0]  # 3.10.0
ACCESS_KEY = "..."

# DEVICE_ID, SERIAL_NUMBER, USERNAME, etc. are console- and account-specific
nas = nnas.NNASClient()
nas.set_device(DEVICE_ID, SERIAL_NUMBER, SYSTEM_VERSION)
nas.set_title(int(TITLE_ID, 16), TITLE_VERSION)
nas.set_locale(REGION_ID, COUNTRY_NAME, LANGUAGE)

access_token = await nas.login(USERNAME, PASSWORD)

# The game server ID is generally the last 8 hex digits of the Japanese
# version of the game
nex_token = await nas.get_nex_token(access_token.token, int(TITLE_ID[8:], 16))

nex_version = (
    NEX_VERSION[0] * 10000
    + NEX_VERSION[1] * 100
    + NEX_VERSION[2]
)

s = settings.default()
s.configure(ACCESS_KEY, nex_version)

async with backend.connect(s, nex_token.host, nex_token.port) as be:
    async with be.login(str(nex_token.pid), nex_token.password, None) as client:
        # Connection established, onward!
        store = datastore.DataStoreClient(client)

As it happens, there were. Two key pieces of data were missing from Kinnay's wiiu.json. The first was the title version, which I quickly fixed with the IDBE archive collected by Pretendo. The second was significantly harder and led me into uncharted territory on the WiiU…

Access Keys

Unlike title versions, access keys are not obtainable from any eshop endpoint. These strings are generally identified by reverse engineering each game's binary and looking for a specific known pattern. That pattern was made pretty clear by the keys published by Kinnay:

7b9b09cb
24e0a63b
59d539a9
7fcf384a
...

The access key is just 8 lowercase hex digits, i.e. a 32-bit number. So rather than bulk-obtaining game binaries, the approach that made sense to me was brute force.

Brute forcing over the network would have been prohibitively time consuming (\(2^{32}\) seconds is 136 years!) but I could adopt a strategy used by advanced password recovery tools like Hashcat. Instead of trying random strings against some endpoint, those tools apply some operation (usually concatenation with a salt, then hashing) and attempt to match against a known hash, allowing local testing that can be massively parallelized. The operation I chose was similar: upon connection, send a SYN packet and store the signed SYN packet sent back:

import threading

# udp, util, prudp and scheduler are NintendoClients/anynet internals;
# s is the configured settings object from earlier

# Firstly, obtain one SYN packet
syn_packet = SynPacket()  # simple container class for the captured fields
syn_packet_lock = threading.Lock()
syn_packet_lock.acquire()

# WiiU is UDP
async with udp.connect(nex_token.host, nex_token.port) as socket:
    async with util.create_task_group() as group:
        transport = prudp.PRUDPClientTransport(s, socket, group)

        async def process_incoming():
            while True:
                data = await transport.socket.recv()

                with util.catch(Exception):
                    packets = transport.packet_encoder.decode(data)
                    for packet in packets:
                        if packet.type == prudp.TYPE_SYN:
                            # Save the server's signed SYN reply for offline testing
                            syn_packet.packet = packet
                            syn_packet.syn_packet_options = (
                                transport.packet_encoder.encode_options(packet)
                            )
                            syn_packet.syn_packet_header = (
                                transport.packet_encoder.encode_header(
                                    packet,
                                    len(syn_packet.syn_packet_options),
                                )
                            )
                            syn_packet.syn_packet_payload = packet.payload
                            syn_packet.syn_packet_signature = packet.signature
                        else:
                            await transport.process_packet(packet)

        transport.group.start_soon(process_incoming)

        client = prudp.PRUDPClient(s, transport, s["prudp.version"])
        with transport.ports.bind(client, type=10) as local_port:
            client.bind(socket.local_address(), local_port, 10)
            client.connect(socket.remote_address(), 1, 10)

            async with client:
                client.scheduler = scheduler.Scheduler(group)
                client.scheduler.start()

                client.resend_timeout = 0.05
                client.resend_limit = 0

                try:
                    await client.send_syn()
                    await client.handshake_event.wait()
                except RuntimeError:
                    pass
                finally:
                    # Unblock the main flow whether or not the handshake completed
                    syn_packet_lock.release()

    # Wait until the SYN packet has been captured
    syn_packet_lock.acquire()
    syn_packet_lock.release()

Then apply the same operation used internally to sign the packet, and check the result against the signature the server sent:

import hashlib
import hmac
import struct

def test_access_key(access_key, syn_packet):
    key = hashlib.md5(access_key.encode()).digest()
    mac = hmac.new(key, digestmod=hashlib.md5)
    mac.update(syn_packet.syn_packet_header[4:])
    mac.update(b"")  # session key: empty during the SYN handshake
    mac.update(struct.pack("<I", sum(access_key.encode())))
    mac.update(b"")  # connection signature: empty during the SYN handshake
    mac.update(syn_packet.syn_packet_options)
    mac.update(syn_packet.syn_packet_payload)

    return mac.digest() == syn_packet.syn_packet_signature

Note the similarity to the procedure NintendoClients uses to sign an outgoing packet:

class PRUDPMessageV1:
    def __init__(self, settings):
        self.access_key = settings["prudp.access_key"].encode()

    def signature_size(self): return 16

    def calc_packet_signature(self, packet, session_key, connection_signature):
        options = self.encode_options(packet)
        header = self.encode_header(packet, len(options))

        key = hashlib.md5(self.access_key).digest()
        mac = hmac.new(key, digestmod=hashlib.md5)
        mac.update(header[4:])
        mac.update(session_key)
        mac.update(struct.pack("<I", sum(self.access_key)))
        mac.update(connection_signature)
        mac.update(options)
        mac.update(packet.payload)
        return mac.digest()

On average I expected to check only \(2^{31}\) access keys before finding the correct one, but that alone was not a big enough improvement: single-threaded, each game took more than 1.5 hours to check. With 746 games advertising themselves as NEX compatible the process would take more than a month, which was unacceptable. So, in order to finish on time, I ran the check in 16 processes simultaneously on my MacBook M1, the machine I own with the best CPU. I split the \(2^{32}\) keyspace evenly between the processes so they wouldn't waste time syncing with each other.

from multiprocessing import Array, Lock, Process, Value

NUM_PROCESSES = 16
LIST_PATH = "..."  # output file for cracked keys

def range_test_access_key(
    i, syn_packet, host, port, title_id, found_key, done_flag
):
    interval = int(pow(2, 32) / NUM_PROCESSES)

    for number_key_base in range(interval):
        number_key = number_key_base + i * interval

        if number_key_base % 1000000 == 0:
            # Check occasionally if it's time to terminate
            if done_flag.value:
                return

        string_key = hex(number_key)[2:].rjust(8, "0")
        if test_access_key(string_key, syn_packet):
            entry = "%s, %s, %s, %s, (%d)" % (
                hex(title_id)[2:].upper().rjust(16, "0"),
                hex(title_id)[-8:].upper(),
                string_key,
                host,
                port,
            )

            with open(LIST_PATH, "a") as list_file:
                list_file.write("%s\n" % entry)

            print(entry)

            found_key.value = ("%s" % string_key).encode()

            done_flag.value = True
            break

found_key_lock = Lock()
found_key = Array("c", 10, lock=found_key_lock)
done_flag = Value("b", False)

processes = [
    Process(
        target=range_test_access_key,
        args=(
            i,
            syn_packet,
            nex_token.host,
            nex_token.port,
            int(TITLE_ID, 16),
            found_key,
            done_flag,
        ),
    )
    for i in range(NUM_PROCESSES)
]

for p in processes:
    p.start()
for p in processes:
    p.join()

if found_key.value:
    possible_access_keys.add(found_key.value.decode("utf-8"))

With this code I managed to bring the time per game down to 8 minutes, which was easily quick enough when leaving my M1 on overnight (and bringing my laptop to college with the process still running!). This was one of the few times I've managed to push my M1 to 100% CPU usage, which I took as a sign that I was writing code that got the most out of my hardware.

After generating this list over 4 days, I was surprised to find that nearly all of the access keys I had cracked had 0 results on Google and GitHub. I was not just the first person to scrape these games; I was likely the first person to ever connect to them over NintendoClients! That reaffirmed my motivation to continue before these games were lost in less than a month.

Choosing The Protocols

According to Kinnay's wiki, NEX is based on Quazal Rendez-Vous, which exposes a number of "protocols", each with its own endpoints. While NEX keeps most base protocols from QRV (including matchmaking, a key part of online play), no base protocol, other than the unused Persistent Store (24), exposes the long-term data storage used by games. So the juicy protocols worth scraping were generally Nintendo's custom ones. The two I settled on were the one used for leaderboards (Ranking, protocol 112) and the one used for miscellaneous data not stored externally (DataStore, protocol 115).

The major benefit of scraping NEX games is the compatibility they generally share in how their data is accessed, so I didn't have to divert time to misbehaving games in the short month I had left. The Ranking protocol, for example, is very simple to bulk scrape. My approach was to use the following endpoint for the first 1000 results:

ranking_client = ranking.RankingClient(client)

order_param = ranking.RankingOrderParam()
# When players share scores, the one who got the score first goes first
order_param.order_calc = ORDINAL_RANKING
# Starts at 0, increases each iteration by OFFSET_INTERVAL
order_param.offset = cur_offset
order_param.count = OFFSET_INTERVAL

rankings = await ranking_client.get_ranking(
    ranking.RankingMode.GLOBAL,  # Get the global leaderboard
    category,
    order_param,
    0,
    0,
)

Then I switched to the following endpoint once the game stopped returning data, as quite a few games only displayed the first 1000 results in-game and only showed additional results around your own personal rank:

ranking_client = ranking.RankingClient(client)

order_param = ranking.RankingOrderParam()
order_param.order_calc = ORDINAL_RANKING
order_param.offset = 0
order_param.count = OFFSET_INTERVAL

rankings = await ranking_client.get_ranking(
    ranking.RankingMode.GLOBAL_AROUND_SELF,  # Get the leaderboard around this player
    category,
    order_param,
    last_id_seen,
    last_pid_seen,
)

Well, great, except for one thing: categories. Like the access keys, games do not report their categories. Unless specifically designed around another process, like a DataStore endpoint, categories are hard-coded into the game binary. As with the access keys I did not want to obtain game binaries, but unlike the access keys there is no way to locally brute force the complete list. I experimented with parallelizing network requests for categories, but the results were unfortunate.

With that route closed, I had to settle on a heuristic. Some games are known to use bitwise categories (like Donkey Kong Country: Tropical Freeze), so any approach that checked only a small range of categories would likely miss some. I did not have an alternative I could build quickly, so I just chose the range 0 to 1000, as sketched below. That worked fine for most games, but it was clear from the beginning that I was missing valuable data: Puyo Puyo Tetris was an egregious example that seemed to create new categories for every match, and Romance of the Three Kingdoms XII's single category of 223 seemed suspiciously far from 0.
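For illustration, the heuristic boils down to a loop like this, reusing the get_ranking call from earlier. The find_categories helper is my own framing; how an empty category fails (an RMCError such as Ranking::NotFound versus simply zero results) varies by game.

from nintendo.nex import common, ranking

# Sketch of the category heuristic: probe categories 0..1000 and keep the
# ones that return data. find_categories is a hypothetical helper.
async def find_categories(client, limit=1000):
    ranking_client = ranking.RankingClient(client)
    found = []
    for category in range(limit + 1):
        order_param = ranking.RankingOrderParam()
        order_param.offset = 0
        order_param.count = 1
        try:
            rankings = await ranking_client.get_ranking(
                ranking.RankingMode.GLOBAL, category, order_param, 0, 0
            )
            if rankings.data:
                found.append(category)
        except common.RMCError:
            # Empty categories tend to surface as a NEX error
            continue
    return found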

Next, I tackled the DataStore protocol. This one had a lot more applicable endpoints, so I tested all of them on every game to see which were supported. From just a sample of games I could tell which endpoints were already out of the question.

The 8 sampled games were DKC: TF, FotNS, Hyrule Warriors, Injustice: GAU, M&S 2014, M&S 2016, MK8 and Mighty N. 9.

Endpoint                             Unsupported (of 8 sampled games)
get_metas                            2
search_object                        2
get_ratings                          2
get_specific_meta_v1                 2
get_rating_with_log                  2
get_persistence_infos                2
prepare_get_object_or_meta_binary    2
prepare_get_object                   2
prepare_get_object_v1                2
get_password_infos                   2
get_metas_multiple_param             2
get_object_infos                     7
search_object_light                  7

Some threw Core::NotImplemented, which is an error built into the protocol, and some threw runtime errors for most inputs, like (ordered according to how scared I was to see them) PythonCore::ConversionError, OverflowError: Buffer overflow and DataStore::OperationNotAllowed.
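The testing itself was mechanical; a sketch of what one probe might look like is below, with my own hypothetical classification strings. RMCError is how NintendoClients surfaces NEX error codes like the ones above.

from nintendo.nex import common, datastore

# Hypothetical probe: call an endpoint with minimal parameters and classify
# the outcome. A real run repeats this for each candidate endpoint and game.
async def probe_search_object(client):
    store = datastore.DataStoreClient(client)
    param = datastore.DataStoreSearchParam()
    param.result_range.size = 1
    try:
        await store.search_object(param)
        return "supported"
    except common.RMCError as e:
        # e.g. Core::NotImplemented or DataStore::OperationNotAllowed
        return "unsupported (%s)" % e
    except Exception as e:
        # Runtime errors like the buffer overflow above land here
        return "error (%s)" % type(e).__name__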

These results were interesting because games primarily use NEX as a data store; leaderboards are not nearly as common. And the games that did not support any DataStore endpoint usually didn't support any Ranking endpoints either. Jon from Pretendo and I had theories (like the idea that a game that only supported P2P online play still needed to register itself as a NEX compatible server), but otherwise we had no idea.

I was disappointed to see get_object_infos lack support almost across the board, because it is the only endpoint that supports bulk object requests by data ID, the primary way data is referenced in the data store. I decided to use search_object with an offset to get the first data ID, falling back on search_object with a timestamp predating the WiiU's release.

store = datastore.DataStoreClient(client)

param = datastore.DataStoreSearchParam()
param.result_range.offset = 0
param.result_range.size = 1
param.result_option = 0xFF
res = await store.search_object(param)

last_data_id = None
if len(res.result) > 0:
    last_data_id = res.result[0].data_id
else:
    # Try the timestamp method as a backup (Jan 1, 2012 predates every
    # WiiU release)
    param = datastore.DataStoreSearchParam()
    param.created_after = common.DateTime.fromtimestamp(1325401200)
    param.result_range.size = 1
    param.result_option = 0xFF
    res = await store.search_object(param)

    if len(res.result) > 0:
        last_data_id = res.result[0].data_id

if last_data_id is None or last_data_id > 900000:
    # Just start here anyway lol
    last_data_id = 900000

Then, with the first data ID obtained, I used the same assumption as in my Mario Maker 2 scraping: that data IDs are sequential. I multiprocessed requests for both metadata and the data itself, again with 16 processes. I used get_metas, which takes a list of data IDs, and prepare_get_object, which individually returns a HTTP URL (plus headers) from which you can request each object.
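Put together, one batch of the scrape looks roughly like the sketch below. fetch_batch is my own framing, and aiohttp is a stand-in for whichever HTTP client you prefer; the prepare_get_object flow (URL plus headers) follows NintendoClients' DataStore definitions.

import aiohttp
from nintendo.nex import datastore

# Sketch of one batch: bulk metadata via get_metas, then each object via
# prepare_get_object. Illustrative, not the exact scraper.
async def fetch_batch(store, data_ids):
    meta_param = datastore.DataStoreGetMetaParam()
    # Response carries per-ID meta info and result codes
    metas = await store.get_metas(data_ids, meta_param)

    objects = {}
    async with aiohttp.ClientSession() as session:
        for data_id in data_ids:
            get_param = datastore.DataStorePrepareGetParam()
            get_param.data_id = data_id
            req_info = await store.prepare_get_object(get_param)
            headers = {h.key: h.value for h in req_info.headers}
            url = req_info.url
            if not url.startswith("http"):
                url = "http://" + url  # some games omit the scheme (assumption)
            async with session.get(url, headers=headers) as resp:
                objects[data_id] = await resp.read()
    return metas, objects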

A benefit of my approach so far was fault tolerance. The scrapers were designed to survive being terminated, whether by me or by a bug, without losing progress. I wrote all of this data into individual SQLite databases, one each for Ranking and DataStore. I'm glad I implemented this, because it became very relevant!
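The fault tolerance itself is unglamorous; a minimal sketch of the idea follows, with an illustrative schema rather than my exact one. The primary key makes re-scraped rows idempotent, and the maximum data ID doubles as a resume point.

import sqlite3

# Minimal sketch of the fault-tolerant writes (illustrative schema and name)
db = sqlite3.connect("datastore_0005000010106900.db")
db.execute(
    "CREATE TABLE IF NOT EXISTS objects "
    "(data_id INTEGER PRIMARY KEY, meta BLOB, data BLOB)"
)

def record(data_id, meta, data):
    # INSERT OR REPLACE makes a re-scraped row harmless after a crash
    db.execute(
        "INSERT OR REPLACE INTO objects VALUES (?, ?, ?)",
        (data_id, meta, data),
    )
    db.commit()  # in practice, commit per batch rather than per row

def resume_point():
    row = db.execute("SELECT MAX(data_id) FROM objects").fetchone()
    return row[0] or 900000  # fall back to the heuristic starting ID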

Grinding… or Babysitting

Once my heuristics were chosen and my scraping approach was finalized I started scraping… and debugging… and scraping… and debugging.

Music Source: Aphex Twin - Vordhosbn

My chosen platform was once again my desktop. It was connected to my router over ethernet, but unlike with Mario Maker 2, where I had dorm internet, I was now at the whims of Xfinity. They frequently throttled my connection, and the maximum download speed was much lower than I had hoped. After all my optimizations, my ISP was the bottleneck.

I had my deadline firmly implanted in my brain, both at school and at home…

By this point, it had been publicly stated I was scraping every game, so tensions and excitement were high…


Progress, progress, progress until…

March 30th. 9 days before the shutdown.

The answer was no. It was time to expand the scope enormously. I had started with 63 WiiU games; I was now adding 143 3DS games to my queue. This new list required access key cracking, changes in authentication, changes in NEX version handling and a lot more debugging, all in a fraction of the time I had spent on a single game, Mario Maker 2, back in 2021.
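For context on the authentication changes: 3DS games authenticate against NASC with console identifiers instead of the WiiU's NNAS account login. The sketch below is my rough reading of NintendoClients' nasc module, so treat the exact setters and response fields as assumptions; the constants are per-console and per-game values.

from nintendo import nasc

# Rough sketch (API details assumed): NASC replaces NNAS on the 3DS side,
# authenticating with console identifiers rather than account credentials
async def login_3ds():
    client = nasc.NASCClient()
    client.set_title(TITLE_ID, TITLE_VERSION)
    client.set_device(SERIAL_NUMBER, MAC_ADDRESS)  # real consoles also supply a cert
    client.set_user(PID, PID_HMAC)

    # Returns the NEX host/port and token, analogous to get_nex_token
    response = await client.login(GAME_SERVER_ID)
    return response.host, response.port, response.token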

Scaling Up

It was time to bring in EC2.

Music Source: Cursedsnake - Resting

I knew EC2 would increase the cost of this project enormously, but my principle of data as a tool for social good made it worth it. I wanted to ensure that if anything from the 12-year lifespan of the WiiU or the 13-year lifespan of the 3DS ever became important to someone, they could have access to it and do whatever research or project they wanted with it. After all, my work as a reverse engineer makes me extremely reliant on data. Data is the lifeblood of my creative fervor.

So it was time to finish, full speed ahead. I was going to continue my scraping-and-debugging cycle until someone at Nintendo blocked me in a way I couldn't bypass.

And that they did: at 6:22 PM MST the NNAS authentication servers for the WiiU started returning errors for every request. A few dedicated players stayed on the Nintendo Network long past this point because they never allowed their WiiU to attempt an NNAS login, but my exact workflow required NNAS to work properly. I watched as my fellow soldiers equipped autoclickers and auto-play mods to evade the impending doom of a login attempt. My story here was over.

Results

After the dust had settled and it became perfectly clear there was no longer any way to connect to any Nintendo Network service, I analyzed all the SQLite files I had collected. This step became much more difficult once I realized just how many individual SQLite files I had to merge, especially because some of those databases would conflict if I naively combined them. As expected, I would have to write still more custom Python.

Since switching to a RAID 5 array, my filesystem has become faster and more reliable than the 14 terabyte hard drive I used for the Mario Maker 2 scrape. Even with the improved setup, I knew the work would take a while, and I wasn't in a hurry, as I envisioned my preliminary projects as being mostly artistic in nature. That's why this blogpost is coming out nearly a year after Nintendo disabled the NNAS authentication server on April 9th, 2024.
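The merging itself came down to SQLite plumbing. A sketch of the approach, with an illustrative schema and file layout rather than my exact ones: ATTACH each per-run database and let the primary key drop exact duplicates, leaving only genuinely conflicting rows for the custom Python.

import glob
import sqlite3

# Sketch of the merge step (illustrative schema and file names)
merged = sqlite3.connect("ranking_merged.db")
merged.execute(
    "CREATE TABLE IF NOT EXISTS rankings "
    "(title_id TEXT, category INTEGER, pid INTEGER, score INTEGER, "
    "PRIMARY KEY (title_id, category, pid))"
)

for path in glob.glob("ranking_*.db"):
    merged.execute("ATTACH DATABASE ? AS src", (path,))
    # Exact duplicates are dropped; conflicting rows still need manual handling
    merged.execute("INSERT OR IGNORE INTO rankings SELECT * FROM src.rankings")
    merged.commit()
    merged.execute("DETACH DATABASE src")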

The 2 EC2 instances I ran did cost a lot… 120 bucks. For just one week that was a bit egregious, but I knew going in there would be damage. For comparison, the Mario Maker 2 API costs me approximately $120 a year. I am no stranger to spending money to realize my ideal projects.

The approximate stats for Ranking and DataStore scraping (number of entries) are as follows:

Ranking Results

Title ID            Count
00040000000AA700    5060
00040000000C9100    11445
00040000000D1000    5798
00040000000D4A00    102996
00040000000D6C00    52
00040000000DBD00    5440
00040000000E8100    34251
00040000000E9000    11798
00040000000ED500    58
00040000000EDA00    606
00040000000EF000    850100
00040000000F2200    840640
00040000000F4F00    31297
00040000000F5100    91108
00040000000F6B00    10104
00040000000FB000    1181
0004000000103F00    25462
0004000000108100    4710
000400000010BD00    2371
0004000000111E00    137695
0004000000112300    27
0004000000112500    128323
0004000000112600    35274
0004000000118500    109
0004000000125400    2421157
000400000012E900    3
000400000012F800    4957
0004000000134500    244
0004000000138900    69590
000400000013BB00    28260
000400000013C300    8110
0004000000140B00    6956
0004000000141E00    785
0004000000144400    157305
000400000014A400    292538
000400000014A900    12
000400000014BA00    15519
000400000014DF00    2263
0004000000150100    59951
0004000000154300    371
0004000000156900    5480
0004000000158E00    1373
0004000000159000    8591
000400000015A200    158067
000400000015DC00    14326
0004000000162300    237
0004000000162B00    4031
0004000000163C00    3208
0004000000169E00    190026
000400000016AD00    202656
0004000000183500    958
0004000000185F00    14
0004000000196D00    256589
0004000000197500    71555
0004000000198700    338866
0004000000198B00    5344
0004000000199900    21
000400000019A400    115
000400000019A800    10220
00040000001A2B00    253590
00040000001A4500    3054803
00040000001A7300    274
00040000001ACB00    63554
00040000001B3F00    44791
0005000010100600    964040
0005000010106900    2163988
000500001010E400    1011
000500001010EB00    10948309
000500001010FB00    38471
0005000010110100    6126
0005000010110900    352031
0005000010111700    133809
0005000010111C00    747
0005000010111F00    549722
0005000010113800    1352
0005000010113A00    635316
0005000010113B00    47741
0005000010116100    408930
000500001011AF00    2068610
000500001011C300    90175
0005000010129000    738878
000500001012B100    272986
000500001012BC00    9971182
000500001012C000    105242
000500001012F000    2921
0005000010131900    7583
0005000010132A00    325
0005000010135000    16
0005000010138A00    417377
0005000010144800    401980
0005000010145E00    221542
0005000010147400    797
0005000010149700    558319
000500001014D900    223575
0005000010150300    30562
0005000010173300    369993
000500001017AC00    2225
0005000010190300    2738821
0005000010193300    561
0005000010198F00    838151
000500001019A600    77343
00050000101A1400    6980
00050000101A5E00    92365
00050000101C0A00    15328
00050000101C5800    4238212
00050000101C5A00    29271

DataStore Results

Type        Title ID            Count
METADATA    0005000010106900    6729511
METADATA    0005000010111700    4912843
METADATA    0005000010144800    13952
METADATA    000500001017CD00    212
METADATA    0005000010100600    520467
METADATA    000500001010EA00    13340
METADATA    000500001010EB00    17741221
METADATA    0005000010110900    58026
METADATA    0005000010110E00    24093
METADATA    0005000010111C00    15793
METADATA    0005000010116100    895671
METADATA    000500001011A800    840
METADATA    0005000010149700    3993679
METADATA    000500001014D900    827923
METADATA    0005000010190300    3728394
METADATA    00050000101BEB00    128191
METADATA    0004000E00086400    232814
METADATA    000400000011C400    8212012
METADATA    0004000000162300    8908
METADATA    0004000000178800    5274742
METADATA    000400000017D500    224
METADATA    00040000001B4A00    457047
METADATA    00040000000EF000    29808
METADATA    000400000012D800    221096
METADATA    0004000000055D01    8063476
METADATA    00040000000C9D00    1679
METADATA    00040000000D4500    1927
METADATA    0004000000124A00    862865
METADATA    000400000012DC00    593736
METADATA    0004000000131F00    116602
METADATA    0004000000134600    16197440
METADATA    0004000000149B00    2228
METADATA    0004000000155000    5646
METADATA    000400000015CD00    1432570
METADATA    0004000000164800    618434
METADATA    0004000000166B00    181636
METADATA    0004000000169E00    180800
METADATA    000400000016AD00    117
METADATA    0004000000197500    89172
METADATA    000400000019F600    539369
METADATA    00040000001A0000    28810
METADATA    00040000001ACB00    159
METADATA    00040000001B1900    16580
METADATA    00040000001B3F00    84172
METADATA    00040000001B4300    3576
METADATA    00040002000B8B01    18093
METADATA    0005000010128C00    978
DATA        0005000010106900    2225634
DATA        0005000010144800    13863
DATA        0005000010100600    504375
DATA        000500001010EA00    12992
DATA        000500001010EB00    735915
DATA        0005000010110900    58023
DATA        0005000010110E00    19789
DATA        0005000010111C00    15792
DATA        0005000010149700    977249
DATA        0005000010190300    1776948
DATA        0004000E00086400    232819
DATA        000400000011C400    663073
DATA        0004000000162300    8908
DATA        00040000001B4A00    112433
DATA        000400000012D800    152984
DATA        000400000012DC00    322155
DATA        0004000000134600    836886
DATA        0004000000155000    5650
DATA        000400000015CD00    673325
DATA        0004000000164800    384792
DATA        000400000016AD00    117
DATA        000400000019F600    129319
DATA        00040000001ACB00    159
DATA        00040002000B8B01    15940
DATA        000500001014D900    585
DATA        00050000101BEB00    8936
DATA        0004000000169E00    90

Archived Leaderboards

The project linked on this blogpost displays the archived leaderboards of WiiU games like Mario Kart 8 and Mario & Sonic 2016. I based the style on the player leaderboards in Mario Maker 2. You can also select a user to see all of their scores recorded across other WiiU games. Finally, I added an SRC (Speedrun.com) style WR timeline to give a better picture of each leaderboard, for any game that reported an update_time when calling get_ranking. Please note one limitation: since a player's record is replaced whenever they beat it, the only scores available are each player's PB, making the timeline less accurate.
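Reconstructing the timeline from PBs is simple enough to sketch. wr_timeline below is my own illustration: sort the archived PBs by update_time and keep every entry that beats the running best. Because beaten scores were overwritten server-side, this yields a lower bound on how often the WR actually changed hands.

from dataclasses import dataclass

@dataclass
class Entry:
    pid: int
    score: int
    update_time: int  # as reported by get_ranking

# Illustrative reconstruction: walk PBs in update_time order, keeping each
# entry that beats the running best score
def wr_timeline(entries, lower_is_better=True):
    better = (lambda a, b: a < b) if lower_is_better else (lambda a, b: a > b)
    timeline, best = [], None
    for e in sorted(entries, key=lambda e: e.update_time):
        if best is None or better(e.score, best.score):
            best = e
            timeline.append(e)
    return timeline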

If you remember a time or score you got, you can find yourself in the archive; or, if you happen to know your PID, you can put it directly in the URL.

As the "scores" returned by get_ranking are just integers, I generally inferred the datatype and formatted it for easier viewing. Additionally, as the categories have no additional metadata linking them to specific levels or leaderboards in game I tried my best to research each game to determine the likely order of categories. This research consisted of searching on YouTube for old ranked gameplay, finding the SRC leaderboards and looking at WRs and using a cool site called cyberscore.me.uk that went under my radar. My appreciation goes to them for enabling me to label quite a few games correctly.

For an example of a good leaderboard check out Stealth Inc 2’s 1-1 Forever a Clone.

Future Work

I will keep working on interpreting the DataStore archive for two big reasons: I really want to make level viewers and I really want to play back replays (especially from ACNL and MK8, respectively). Look out for a future post about those!

Remember that the world is out there for the taking. With the help of just a few smart people at Pretendo, I did something that can never be done again. With enough motivation, and a bit of time, you can do anything you put your mind to.