[00:20] *** achip has joined #archiveteam-bs
[00:20] *** Ryz has joined #archiveteam-bs
[00:20] *** Somebody2 has joined #archiveteam-bs
[00:20] *** irc.Prison.NET sets mode: +o Somebody2
[00:20] *** Fusl__ sets mode: +o Ryz
[00:20] *** Fusl sets mode: +o Ryz
[00:20] *** Fusl_ sets mode: +o Ryz
[00:20] *** svchfoo1 sets mode: +o Ryz
[01:19] *** BlueMax has joined #archiveteam-bs
[03:15] *** Quirk8 has quit IRC (END OF LINE)
[03:26] *** qw3rty119 has joined #archiveteam-bs
[03:28] *** Quirk8 has joined #archiveteam-bs
[03:29] *** qw3rty118 has quit IRC (Ping timeout: 612 seconds)
[03:33] *** Quirk8 has quit IRC (END OF LINE)
[03:47] *** pew has quit IRC (Quit: WeeChat 1.6)
[03:55] *** odemgi_ has joined #archiveteam-bs
[03:57] *** odemgi has quit IRC (Read error: Operation timed out)
[04:04] *** pew has joined #archiveteam-bs
[04:06] *** fredgido has quit IRC (Remote host closed the connection)
[04:07] *** fredgido has joined #archiveteam-bs
[04:36] *** katocala has quit IRC (Read error: Operation timed out)
[05:39] *** davie has joined #archiveteam-bs
[05:40] I'm terribly sorry to bother you all, but I would like to ask: what's the best program to recover files from a 10TB WD drive that was originally NTFS and mistakenly formatted as exFAT? Active@ doesn't work. I'm thinking file carving using Windows or Solaris.
[05:40] Also, your wiki is very informative and well written.
[06:12] I've been using Recuva (https://www.ccleaner.com/recuva) for years, but it's not a task I encounter as often anymore. I had used WinHex prior.
[06:17] I would be surprised if a piece of software didn't exist that could identify filetype headers and footers and piece together unfragmented, sequential data.
[06:21] oh, you might want to try TestDisk. I've had a lot of harrowing luck in the past. But you'll need to know a thing or two about disks, partitions, volumes, etc. https://www.cgsecurity.org/wiki/TestDisk
[06:34] *** davie has quit IRC (Ping timeout: 260 seconds)
[06:53] *** davie has joined #archiveteam-bs
[06:53] weird, booted
[08:02] *** deevious has joined #archiveteam-bs
[08:03] davie: did you lose my response?
[10:24] *** h3ndr1k has quit IRC (Ping timeout: 745 seconds)
[11:14] *** luckcolor has joined #archiveteam-bs
[11:52] *** katocala has joined #archiveteam-bs
[11:56] davie: Whatever you do, make a full block-level copy of the entire drive and only operate on that copy.
[11:59] *** katocala has quit IRC ()
[12:17] *** BlueMax has quit IRC (Quit: Leaving)
[12:32] *** katocala has joined #archiveteam-bs
[12:59] *** h3ndr1k has joined #archiveteam-bs
[14:37] *** deevious has quit IRC (Quit: deevious)
[14:46] I received no reply, Raccoon, I'm sorry.
[14:47] JAA, I was going to dd the drive, but 10TB is difficult to manage. Hopefully it completes soon.
[14:47] you could split it into say 2TB chunks or whatever's easiest to work with
[14:47] That's a great idea, thank you
[14:47] you will miss things that span the boundaries (unless you deliberately overlap), so be aware of that.
[14:48] (If you'd like to hang out here more often, I'd suggest you get a proper IRC client instead of using EFnet's crappy webchat thingy.)
[15:01] davie: message query sent with 3 lines of chat history
[15:03] not sure how reasonable it is to make a sector-perfect copy of a 10 TB drive :)
[15:04] TestDisk seems like the way to go
[15:19] Raccoon, that's the consideration I was making. Frankly, I really need this data, and it's of public interest. That being said, I was extremely unprofessional in not using additional tools to verify the volume designation for the media, and worse still, exFAT is a vexatious FS for this use case. Hopefully I can hit the drive with a good carving tool and find the signatures I need to sort the data.
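A minimal sketch of the chunked block-level copy suggested at 11:56 and 14:47: image the raw device in pieces with a small deliberate overlap so nothing spanning a chunk boundary is lost. The device path, output directory, chunk size, and overlap below are illustrative assumptions, not values from the chat.

    # Copy a raw device in 2 TiB chunks with a deliberate overlap between
    # neighbouring image files. DEVICE and OUT_DIR are hypothetical.
    import os

    DEVICE = "/dev/sdX"
    OUT_DIR = "/mnt/scratch"
    CHUNK = 2 * 1024**4        # 2 TiB per image file
    OVERLAP = 64 * 1024**2     # 64 MiB shared between neighbouring chunks
    BLOCK = 4 * 1024**2        # 4 MiB per read

    with open(DEVICE, "rb") as src:
        total = src.seek(0, os.SEEK_END)
        start, index = 0, 0
        while start < total:
            end = min(start + CHUNK, total)
            src.seek(start)
            with open(f"{OUT_DIR}/chunk{index:03d}.img", "wb") as dst:
                remaining = end - start
                while remaining > 0:
                    buf = src.read(min(BLOCK, remaining))
                    if not buf:
                        break
                    dst.write(buf)
                    remaining -= len(buf)
            start += CHUNK - OVERLAP   # advance by less than a full chunk
            index += 1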
[15:21] If I am successful, I will share the signatures after uploading the data. Curiously enough, there is some trouble in finding the signature profiles. I purchased every version of the UFS Explorer and R-Studio tools, but right now I'm dealing with the dd/FTK aspect of preserving what I can.
[15:23] Ideally you would do this using vfs in a flash storage appliance, but that's outside of the budget for a lowly sociologist archive novice
[15:24] Also ++ on TestDisk
[15:24] I'd also encourage a backup regime :)
[15:24] I'd take the time to read thoroughly through the TestDisk documentation, i.e. https://www.cgsecurity.org/wiki/TestDisk and https://www.cgsecurity.org/wiki/TestDisk_Step_By_Step and other Google searches like http://www.tipsninja.com/testdisk-recover-lost-deleted-partitions/
[15:25] hopefully your Quick Format didn't destroy too much
[15:25] this was a one-off case, normally I comply with a data hygiene and retention schedule that was written by my attorney for other reasons.
[15:26] if you like PDFs, here's the full documentation -- https://www.cgsecurity.org/testdisk.pdf
[15:26] Thank you for the links. I'm curious though, does anyone here use EnCase? This is unrelated.
[15:27] I'll review the docs, it's been a while since I used TestDisk for this type of work. Thank you again for the links
[15:28] Never heard of or used EnCase. It seems like a corporate non-GNU, non-FOSS product
[15:32] their software page looks like it's been around only a few years, but their company for a couple of decades. https://web.archive.org/web/*/guidancesoftware.com/encase-forensic-imager
[15:32] The forensics tool I grew up around was WinHex, but I don't know what kind of status it holds today
[15:37] EnCase is a great tool that has been evaluated extensively by DHS, NJIC, NIST, and others. It's extremely expensive though, and requires HASP dongles for licensing.
[15:38] There are many professors in business-forensics types of education that have "altered" useful versions for their students. I don't have that access, as my education didn't require that type of experience.
[15:38] Prolly start with just letting TestDisk perform a read-only diagnosis to see what it can find. It looks like it's very clear about when it writes to the disk
[15:40] The reason for using EnCase is that you can build custom scripts and profiles that find what you need from a raw image. For example, there isn't another tool that can find Bitcoin wallets or Twitch XML chat logs by scripting a plugin, to my knowledge.
[15:40] Agree RE read-only
[15:41] you got me curious now, cuz I'm not finding any bitmap images of what exactly a Quick Format looks like for various filesystems. Think I'm going to fill up a small test partition with 0x55 and then format it with each of the various filesystems and create a visual image of the writes that Windows makes. maybe even repeat it for each major version of Windows from 98 through 10
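For the 0x55 experiment just described, the check afterwards amounts to finding which sectors no longer hold the fill pattern. A rough sketch; the image path, its size, and the 512-byte sector size are assumptions:

    # Fill a small test image with 0x55, format it with the filesystem under
    # test (outside this script), then list which 512-byte sectors the
    # format actually wrote.
    SECTOR = 512
    FILL = b"\x55" * SECTOR
    IMAGE = "testfs.img"   # hypothetical test image / partition

    def fill_image(size_mb=256):
        with open(IMAGE, "wb") as f:
            for _ in range((size_mb * 1024 * 1024) // SECTOR):
                f.write(FILL)

    def touched_sectors():
        # yield sector numbers whose contents differ from the fill pattern
        with open(IMAGE, "rb") as f:
            n = 0
            while True:
                sector = f.read(SECTOR)
                if not sector:
                    break
                if sector != FILL:
                    yield n
                n += 1

    # fill_image(); format the image; then print(sorted(touched_sectors()))
    # to see where the Quick Format landed.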
[15:41] Andrew Case sold Guidance Software to the Canadian company OpenText, and frankly, they are booties.
[15:41] booties?
[15:42] No public access to support documents or files of any sort unless you pay for the $10,000 annual license, to my knowledge
[15:42] that looks like the company that makes EnCase
[15:42] they are jerks, really
[15:43] Guidance was the original seller and very friendly to academics, established maybe 2004. Sold in 2018 to a company that is tailored for law firms and feds.
[15:44] but really, that's a pretty accurate carry-over of data recovery from the 1980s and '90s. Empty people's and corporate wallets while they're suffering from grief, panic, denial and bargaining
[15:45] This is a real shame, because I wrote hundreds of EnScripts to do "speed forensics" and the new company is just not interested in academia
[15:46] Ahh yes indeed, Raccoon. Some of the best tools were acquired in M&A deals by companies looking to stop competitive use.
[15:46] *** ndiddy has quit IRC (Remote host closed the connection)
[15:46] well, good luck and let us know. i'm curious if you find a nice friendly tool that can identify and shape every filetype on the planet through header-footer
[15:47] Even things like PhotoRec aren't what they used to be. R-Studio was purchased by a German company, even tho I think they are still using the developers out of Ukraine.
[15:47] Will do, Raccoon
[15:47] i understand various hex editors have such features to nicely label what each byte offset is and what it means
[15:47] *** ndiddy has joined #archiveteam-bs
[15:48] since it was a 10 TB drive, your cluster size is going to be rather large, which means that your data's going to be fairly more sequential than with a tiny allocation size
[15:49] so hopefully minimal fragmentation
[15:49] hopefully not a very long operation period with lots of deletes and rewrites
[15:54] Tangential note, I have been considering getting with Scott and doing a massive group buy. It would be great if we could all have DeepSpar, X-Ways, PC-3000, write blockers, Tableau bridges, and the software to use those tools for $2,000 instead of $50k. It would require a substantial group size, like maybe 6,000 people.
[15:57] was X-Ways (WinHex) bought?
[15:57] Oh, and I forgot this earlier, but using CDNs like Akamai for storage of public software should be criminal in my opinion. There are thousands of broken links to old software on Google top-ten pages (100 results per page) because of the issues with transitioning from a CDN for support files to local hosting behind a portal
[16:00] *** DopefishJ has joined #archiveteam-bs
[16:04] *** DFJustin has quit IRC (Ping timeout: 745 seconds)
[16:06] index them and archive :)
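The header-footer identification mentioned at 15:46 boils down to scanning the raw image for known file signatures. A toy sketch for JPEG only (FF D8 FF start, FF D9 end), with the image path as an assumption and none of the fragmentation or false-positive handling a real carver like PhotoRec or an EnScript would need:

    # Toy header/footer carver: scan a raw image for JPEG start/end markers
    # and dump each candidate file to disk.
    HEADER = b"\xff\xd8\xff"   # JPEG start-of-image
    FOOTER = b"\xff\xd9"       # JPEG end-of-image

    def carve_jpegs(image_path, out_prefix="carved"):
        data = open(image_path, "rb").read()   # fine for a small test image
        pos = 0
        count = 0
        while True:
            start = data.find(HEADER, pos)
            if start == -1:
                break
            end = data.find(FOOTER, start)
            if end == -1:
                break
            with open(f"{out_prefix}_{count:04d}.jpg", "wb") as out:
                out.write(data[start:end + len(FOOTER)])
            count += 1
            pos = end + len(FOOTER)
        return count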
[16:43] *** SakoeraTy has joined #archiveteam-bs
[16:47] SakoeraTy
[16:47] Let's talk about it here
[16:47] Okay
[16:48] https://www.aniway.nl/forum/viewtopic.php?f=22&t=32961
[16:49] Heh
[16:49] If I'm reading that correctly, the rest of the site is not affected, right?
[16:49] It is too, but not this soon afaik
[16:50] I see.
[16:50] The magazine will get a rehaul, part of that will be a new website - this was already announced in the magazine's last issue
[16:50] Now suddenly they announce the whole forums will be shut down :/
[16:50] 32,000 topics is quite small.
[16:50] There is a new Discord to replace the forums apparently
[16:51] It used to be bigger
[16:51] Where do you see 32k topics?
[16:51] Oh wait, that's threads
[16:51] 399k posts in 11.7k threads.
[16:51] Sorry, been a long day.
[16:51] They had subforums for Dutch-language anime/manga publishers
[16:51] Which they closed down a few years ago
[16:51] I presumed the &t= was threads
[16:51] It is.
[16:51] As part of a forum cleanup
[16:51] But that's the thread ID, including all deleted threads and ones in private subforums etc.
[16:52] Despite all the interesting talk there now being lost about Dutch manga translations, DVD subtitles, etc.
[16:52] Let's do two AB jobs, one for the forums and one for the site.
[16:52] I'm so glad Polish manga publishers still have at least one forum, and it's decently active.
[16:53] Dutch anime DVD subs tended to vary a lot in quality, many badly translated from French to Dutch by non-native speakers of Dutch
[16:53] Those forums were one place to check (incomplete) information on the quality of some of those
[16:53] They used to vary here too, back when we had anime publishers.
[16:53] Also, localisation decisions made by the translators were shared there
[17:02] I just noticed I said Friday, but I meant Thursday
[17:16] *** ShellyRol has quit IRC (Read error: Operation timed out)
[17:16] *** ShellyRol has joined #archiveteam-bs
[18:01] *** C4K3 has joined #archiveteam-bs
[19:11] *** ShellyRol has quit IRC (Ping timeout: 496 seconds)
[19:12] *** ShellyRol has joined #archiveteam-bs
[19:13] *** HashbangI has joined #archiveteam-bs
[19:50] *** DopefishJ is now known as DFJustin
[19:52] *** DFJustin has quit IRC (Remote host closed the connection)
[19:52] *** DFJustin has joined #archiveteam-bs
[19:52] *** killsushi has joined #archiveteam-bs
[19:56] *** SakoeraTy has quit IRC (Ping timeout: 745 seconds)
[21:03] *** Quirk8 has joined #archiveteam-bs
[21:03] davie: ++ on R-Studio. At least back in the day, 2005-2009, when I was my company's "data recovery specialist", it was very useful for finding remnants of old filesystems many, many times and was definitely worth the money. Dunno about nowadays tho
[21:07] *** fredgido has quit IRC (Remote host closed the connection)
[21:08] *** fredgido has joined #archiveteam-bs
[21:11] RStudio? What does a statistics software interface have to do with data recovery? ;-)
[21:14] *** Pixi` has quit IRC (Read error: Operation timed out)
[21:22] *** davie has quit IRC (Ping timeout: 260 seconds)
[21:27] JAA: probably to identify data that isn't cryptographically random?
[21:31] Yeah, distinguishing data from noise
[21:45] *** Quirk8 has quit IRC (END OF LINE)
[21:46] i hear a lot of people shooting video of hurricane dorian, vertically. they're the reason god is punishing us.
[21:46] oop, wrong ch
[21:47] *** Quirk8 has joined #archiveteam-bs
[21:50] *** Pixi has joined #archiveteam-bs
[22:02] *** SmileyG has joined #archiveteam-bs
[22:04] *** tuluu_ has joined #archiveteam-bs
[22:06] *** Raccoon has quit IRC (Ping timeout: 258 seconds)
[22:06] *** Smiley has quit IRC (Ping timeout: 258 seconds)
[22:06] *** tuluu has quit IRC (Ping timeout: 258 seconds)
[22:06] *** Gfy has quit IRC (Ping timeout: 258 seconds)
[22:06] *** Laverne has quit IRC (Ping timeout: 258 seconds)
[22:06] *** luckcolor has quit IRC (Ping timeout: 258 seconds)
[22:06] *** Laverne has joined #archiveteam-bs
[22:06] *** luckcolor has joined #archiveteam-bs
[22:07] *** godane has quit IRC (Leaving.)
[22:07] *** atbk_ has joined #archiveteam-bs
[22:08] *** Raccoon has joined #archiveteam-bs
[22:10] *** Gfy has joined #archiveteam-bs
[22:11] *** atbk has quit IRC (Ping timeout: 746 seconds)
[22:18] *** mc2 has quit IRC (Ping timeout: 360 seconds)
[22:50] *** BlueMax has joined #archiveteam-bs
[23:19] *** SmileyG has quit IRC (Read error: Operation timed out)
[23:22] *** Smiley has joined #archiveteam-bs
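One way to do what the 21:27-21:31 exchange describes, distinguishing recoverable data from cryptographically random or wiped regions, is a per-block Shannon entropy scan; real file data usually scores well below 8 bits/byte, while encrypted or random regions sit near 8. A small sketch; the 64 KiB block size and the image filename are arbitrary choices for illustration:

    # Per-block Shannon entropy scan over a disk image.
    import math
    from collections import Counter

    def entropy(block):
        counts = Counter(block)
        total = len(block)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    def scan(image_path, block_size=64 * 1024):
        with open(image_path, "rb") as f:
            offset = 0
            while True:
                block = f.read(block_size)
                if not block:
                    break
                yield offset, entropy(block)
                offset += len(block)

    # Usage sketch: for off, h in scan("chunk000.img"), flag blocks with
    # h < 7.5 as likely structured data rather than random noise.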