Take a look. There’s something new happening.
First things first: What’s Patreon?
Patreon is a way that you can support DFIR Training and get some real benefits at the same time. With your support, DFIR Training (the website and the Patreon page) will be able to grow into what I want it to be, and I have high expectations for both.
Next thing. Support = donations.
By supporting the DFIR Training website and Patreon page, I mean donations. Everything on the website is free to access and always will be. Nothing will ever be behind a paywall. However, to help it grow into much more than it is, I need support. Your support means I can dedicate more time, beyond my regular hours, to creating content for both the website and Patreon. I want to go 100mph on this and make both the website and Patreon the place to go for all things DFIR, or at least one of the top places (I have big expectations....just the way I am...).
Your benefit in supporting
I don’t want donations with nothing in return. I’ll work hard to earn any support I receive. For that, the DFIR Training Patreon page offers rewards at different levels, depending on how much you want to support. From as little as a few dollars a month on up, every bit helps me spend more time on you and the community.
Some of your benefits include access to Patreon-supporter-only content: software reviews, software comparisons, face-to-face interviews with practitioners in the field, case studies, business and education tips, and Patreon-only chats with me and friends in the community.
Other benefits include access to training courses that I put together. Some of my courses will be available to you for as long as you are a Patreon supporter! Courses like Placing the Suspect Behind the Keyboard, the X-Ways Forensics Practitioner's Guide online course, and more. Here’s a big thing that I believe in: when you spend time learning, being able to formally document that time is an added benefit. The courses that I will give you access to include certificates of completion (not ‘competence’ or ‘skill’, as I can’t test your skill!). You will have documented proof of the time you spent completing courses, which can be beneficial for your work training records, resume, or court testimony about self-learning, formal courses, and hours of training. This is a big deal because simply watching videos on YouTube doesn’t cut it in court! And I'll keep adding courses for as long as I have something to teach!
On a related note, this is a way to get involved in the DFIR community, even if just a little, as your input goes straight from your mouth to my ear. I am always one to sing the praises of someone's good ideas. My benefit is seeing someone excel. My enjoyment is being a part of helping someone else. Even if you are an 'expert' (we all know about that word, but you know what I mean), your support is appreciated just as much as your words and opinions. All who are in DFIR, all who want to get into DFIR, and all who are related to DFIR are part of the same community. That's the point of this Patreon and DFIR Training work: forever-in-progress, forever-improving.
The bad news
I am limiting the number of Patreon supporters at each level at first. I will increase the numbers as time goes on, but I want to keep the supporter community fairly small in the beginning so I can really focus on those who want to support. That means I want to focus on you and provide what you would like to see. So, if you want to get in now, by all means, jump in!
The good news
I take you at your word and your requests. If you want to see something specific on DFIR Training or the Patreon page, as a supporter, I will do everything I can to make it happen. You can turn it into something that benefits you and the community, and you will be a most valuable part of all of it. As a supporter, you'll get benefits that no one else will.
In advance, thank you. It takes an amazing amount of time and effort to put materials together and then create a platform for the community to use those materials. Your support is so appreciated, I don’t know how to express it enough, except to say, thank you.
So there you have it!
Take a look at the Patreon page here: https://www.patreon.com/DFIRtraining
Let me know what you think. If you don’t want to support for some reason, let me know what it is and maybe I can address it. I try to think of everything, but more brains work better together. I’ll see you on Patreon!
In searching for DFIR tools over the years, I have found lots of “Top 10” lists. I feel that many of these lists could use a few improvements. Here are some pointers on what to look for in the “best DFIR tools” lists, which is what I look for.
Any “Top Ten” list without an accompanying specific purpose is not as useful as a list that is specific. “Top 10 Registry Forensics Tools” is much better than “Top Ten Forensic Tools”. A generic list is practically useless if you are looking for something to accomplish a specific task. I’ve also seen lists with information so incorrect that the writer could not possibly have even tried the tools chosen as the “top 10”, along with tools that were clearly inappropriate for the list.
Even more specific, and just as important, is the licensing of the software. Most lists that I have seen are a combination of commercial, shareware, freeware, and open source software. A much better list than “Top 10 Digital Forensic Tools” would be “Top 10 Open Source Digital Forensic Tools” or “Top 10 Commercial Digital Forensic Tools”. Some lists that I have seen include tools that are "free" but not for commercial use. This is a problem! If a tool is free for personal/home use but you use it in a commercial/legal matter, you could end up violating the End User License Agreement. Being a stickler for licensing details is not a bad thing to be.
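To make the scoping point concrete, here is a minimal sketch in Python of filtering a tool catalog by category and license before ranking anything. The catalog entries, tool names, and license labels below are entirely made up for illustration; they are not from any real "Top 10" list or from the dfir.training database.

```python
# Hypothetical tool catalog: each entry has a name, a category, and a license type.
TOOLS = [
    {"name": "RegRipperish", "category": "registry", "license": "open source"},
    {"name": "HiveProX", "category": "registry", "license": "commercial"},
    {"name": "FreeCarve", "category": "carving", "license": "free (non-commercial only)"},
]

def shortlist(tools, category, license_type):
    """Scope a tool list to one category and one license type,
    the way a well-titled 'Top 10' list should be scoped."""
    return [t["name"] for t in tools
            if t["category"] == category and t["license"] == license_type]

# A "Top 10 Open Source Registry Forensics Tools" list would start from:
open_source_registry = shortlist(TOOLS, "registry", "open source")
```

The point of the sketch is simply that filtering on both axes up front (purpose and license) avoids the apples-and-oranges mix that makes generic lists useless, and it surfaces the "free but not for commercial use" entries before they end up in a commercial workflow.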
So, if I am looking for open-source registry forensics software, I would much rather find a list titled “Top 10 Open Source Registry Forensics Tools” than “Top 10 Digital Forensics Tools”. Even a list titled “Top 10 Registry Forensics Tools” is better than a generic list.
Apples and Oranges Comparisons
A few lists that I have seen which appeared to be specific, such as comparing forensic suites, have chosen incomparable tools. For example, choosing a non-forensic suite against a forensic suite for comparison isn’t a comparison of like tools. You can’t justify a comparison of a Ford Mustang against a semi-truck in a quarter-mile race, but I have seen it on a few occasions. A comparison of two different things with two different purposes is not a comparison.
Suite vs Suite
Anyone who has been doing DFIR for more than a few years knows that each forensic suite does something better than the other forensic suites. That just means that some things in Suite A are better than in Suite B, some things in Suite B are better than in Suite C, and some things in Suite C are better than in Suite A (A>B>C>A>C>B>A).
Lists that rank suites generally, with one being better than another, may do a disservice unless the top tool does everything better than the tools lower on the list. I don’t find that to be accurate or even possible. I have favorite suites, but none of my favorite suites does everything better than every other suite.
Getting drawn into a "Top 10" list of anything is easy enough, especially when the list name is so generic that it will always appear to cover what you are looking for. If you need a VSS forensic tool, and you see "Top 10 Digital Forensic Tools", well...that should cover what you need, right?
Poorly Titled Lists
Top 10 Digital Forensics Tools
Top 10 Open Source Forensic Tools
Top 10 Digital Forensic Suites
Better Titled Lists
Top 10 Open Source Registry Forensic Tools
Top 10 Commercial Forensic Suites for Email Analysis
Top 10 Open Source Suites for Internet Analysis
Is one better than the other?
Perhaps the most glaring problem with any Top Ten list is personal preference and bias. You can’t get away from this. If two tools can analyze email with the same results, why is one typically rated higher than the other? Personal preference or bias (a paid review, or the reviewer simply liking one tool over another) might be the reason. If the list writer doesn't disclose bias, then you have no idea what puts one tool over another.
I am creating a regular series of “Top DFIR Software” lists, based on me actually turning on the software and honestly comparing tools against each other. Some lists may be “Top 5” or “Top 3” if there aren’t that many tools to review for a specific topic. Which brings me to the specificity of the lists…
There will be “Suite A is better than Suite B for email analysis”, “Suite B is better than Suite A for Internet analysis”, and so forth. By better, I mean as defined by a set of criteria such as (1) results, (2) personal preference, (3) speed, (4) cost, (5) ease of use, and other factors that are clearly defined in the list, so you know exactly why I rank tools in any order. If you know the bias and rationale behind how a list is made, you can better judge the list and even reorganize it based on your needs.
Your input is welcome on tool selection and opinions on tools to test and tools that I rate and will rate (via contact form and polls that I create).
If you didn’t catch Jessica Hyde on RallySecurity this week, you really should take a look. Not just to hear Jessica speak, but to catch the nuance that those who are not in “DF” might not fully understand the intricacies of that work, even as they may be intimate specialists in “IR”. Pretty much everyone on RallySec is an extreme expert, and it is cool to see the areas in which each person overtly has expertise.
Personally, I am a Digital Forensics person who has enough Incident Response training and experience to know that I am, first and foremost, a digital forensics person. That means I know where the boundaries of my knowledge reside. My respect goes out to the IR folks who put out the fires and bear the brunt of attacks, breaches, and leaks. That is a tough job. As for me, I'd rather figure out who did it and how they did it, find the evidence to prove it, and let the full weight of justice bear down on the suspects. But that’s just me.
We still have to explain who we are and what we do, even to our fellow computer professionals. There has been more than one occasion where I have gone into an IT department for a forensic gig, only to have some IT folks boast of their knowledge of forensics. Typically, this hasn’t gone well, as most times the IT staff claiming to be forensic ‘experts’ were unable to admit they didn’t know anything about forensics, even though they were experts of their own environment. The best IT staff know their limits, just as the best DF and IR folks do, and they don’t claim knowledge of things they don’t know. I politely get that point across when it happens. I know my job well. They know their job well. We don't know each other's jobs, therefore we work together to solve the problem.
Another point for those getting into the “DF/IR” field is to know which side of the fence you are aiming for. I’ve taken a course or two that I thought would be pure digital forensics but were actually incident response focused. Not a waste of time, but I can see how easily someone can be looking at one goal while walking in the opposite direction. Be sure to take the training and degrees that match what you intend to work toward. Details matter.
One of the biggest differences between DF and IR is the intended purpose of the work. Where IR is meant to stop the pain (stop the attack, seal the leak, etc…), DF work is meant to find out the who, what, when, where, why, and how with the intention of legal proceedings. If there is no intention of legal proceedings, then it really is not “forensics”, even if the actual procedures, methods, and tools are the same. A firefighter doesn’t become a traffic investigator by saving an accident victim, nor does a traffic investigator become a firefighter by investigating a collision. Two different jobs. Two different skillsets. Two different goals (firefighters aren't typically looking for criminal evidence while performing CPR....).
Which is better? DF or IR?
The one that you like is better for you, and the one that I like is better for me. I was never one to willingly run into a house fire when I worked patrol; I never had the misfortune of having to. I probably would have done it if I had to, but I certainly wouldn’t take a firefighter's job, because sooner or later I’d be running into flames. By the same token, I have had firefighters tell me that they have no idea why anyone would want to be a cop and handle domestic violence calls or bank robberies. That’s the thing. Different strokes for different folks. The same is true for DF and IR.
Begs the question…
So why are “DF” and “IR” slammed together as “DFIR”? The way I see it, the foundational knowledge is very close and the processes/procedures/tools are sometimes identical. There are only so many ways to image a drive, pull memory, or check running processes. Much is the same, but the goals are different, and eventually drastically different. You’ll be hard pressed to regularly (if ever) see IR guys in court, just as you’ll be hard pressed to see a DF pro working on an active breach. I believe someone can be both a competent DF and IR person, but it takes quite a bit of work to be highly proficient in both worlds. Possible, but picking one over the other will let your skills excel to a higher degree as a specialist rather than a general practitioner. Just as any medical specialist is in the "medical field", we are all in the "DFIR" field.
I've not yet had the pleasure of meeting David Cowen, but I certainly look forward to the day I can give him a hug. He has consistently created great DFIR content over the years, and his latest video production, the Forensic Lunch Test Kitchen, is another win for everyone.
If you have not seen the Forensic Lunch Test Kitchen, I highly recommend it, not just for the topics, but also for the subtle clues you can pick up from observing critical thinking in action. I am a big fan of figuring things out on your own, a huge supporter of learning how others do things (so that I can improve what I do), and of seeing how someone else processes information to make decisions, which is almost always different from how I would have done it. Not that one way is better than another, but the more you know, the better off you will be.
It is important to consider that knowing "the" answer or "the" way to do something is only 10% of your skill. The other 90% is knowing how to figure out problems, or how to solve a problem using a different way of critical thinking. The best investigators, the best analysts, and the best problem solvers have one major trait in common. They think. They process. They evaluate decisions. They decide. The difference is in how they think, how they process, and how they evaluate their decisions. Everyone does it a little differently, some innately, some methodically, and gaining insight into someone else's methods can only improve yours.
Just some thoughts on “vendor” marketing.
In just about every DFIR email list, social media thread, or forum, there is the sporadic appearance of a vendor who mentions their software in response to a problem someone has, and within seconds of the vendor response, the vendor gets bashed for simply saying, "Hey, maybe my software can help."
I totally get it. I don’t know anyone who wants sales people knocking on their front door, trying to sell something that they didn’t ask for in the first place. Doesn’t matter if it is encyclopedia sales or vacuum cleaner sales, unsolicited sales can be annoying.
Anyone who has been to even one major tech conference quickly learns that if you let a vendor scan your badge in return for a free pen or toy, you will probably receive emails from that vendor for years. The cost of that “free” stuff is agreeing to be contacted by the vendor. So, you kinda ask for it when you do this.
Yes, I get a few mailers. Actual printed materials. Some are done very well and quite informative in addition to selling something. I'll take free useful information anytime.
Back to the online vendor marketing….
First off, I like free stuff. I love FOSS. But I also buy things that I need, and I don’t expect those things to be free. Yes, “Name-Your-Forensic-Tool” might be expensive, but it is expensive for a reason. It takes time and resources to develop, and it incurs marketing expenses to get the word out to those who may need the tool. Basic business. If your business is selling your software and you don't sell it, it never gets developed further and your business closes its doors. Everyone loses.
Here is where I see a divergence in how some vendors are treated by some of us (I say “us” because we are all in this together). Some of the comments I've seen and not agreed with include:
-If the tool is so good, it shouldn’t have to be marketed. Sales should be organic.
-I don’t want to be sold anything.
-Vendors should not be able to comment on their tools in forums/email lists/etc…
-I am so tired of seeing marketing on social media (Linkedin, Facebook, etc…)
Here's the thing: if I never see a marketing attempt by a vendor, I may never see the tool…ever. I plainly will never know that it exists, even if I could use it to solve a problem.
For the email lists and forums, I have no problem with a question being answered by someone who sells a solution to that problem. Actually, a vendor with the solution should answer! That is the point of asking...people are asking for a solution. And if a competitor responds to another competitor, all the better. Now you can see competing products for your solution. You may even discover a solution that you never knew existed.
If you take a look at the tools listing on dfir.training, you will find over 1,000 individual DFIR software tools. I am quite sure that you have never seen 75% of the listings before, maybe even more. There are probably another 1,000 tools that are not listed, which would account for those I have never seen because no one talks about them (therefore…not listed…). I have no doubt that there have been some outstanding tools developed, both FOSS and commercial, that never see the light of day because there is no marketing. I can also imagine some software writers simply gave up because they didn't market their tool to potential buyers. You can have the best tool in the world, but if no one knows about it, it makes no difference how well it works.
If someone has a problem to solve and publicly asks for a solution to that problem, then those who have potential solutions should feel safe in public recommendations, whether as a user or developer of a solution.
I believe the key objection that many seem to have is that because tool-makers make money off the tools they develop, they shouldn’t be allowed to market their tools or chime in on discussions about them.
As for me, I tune out the vendors that I don’t need, keep an eye out for tools that I might need, and keep up on the developers with tools that I regularly need. For the tool developers, if you don't see your tool listed on dfir.training, that means I don't know about it and really want to add it, so let me know about it.
I want to expand a little on David Cowen’s Daily Blog #442: Anti Forensic Tools in the Wild , in regards to terminology I prefer to use.
Like David said, we encounter data wiping in cases on occasion, sometimes on many occasions. Specifically, I mean the cases where the suspect/custodian has intentionally wiped files to prevent recovery by folks like us. Sometimes it works. Sometimes it doesn’t.
When I state in a report or give testimony that someone used anti/counter forensic software, I explain that the person intentionally used software (or hardware) in a manner to thwart forensic recovery or obstruct the investigation and analysis. I am specific when calling out “anti” or “counter” forensics activity, based on several factors.
Name of the software
One of the factors is the name of the software used to wipe the data. Dave’s list has a few good ones. “Evidence Eliminator” is a prime example of a software tool whose name gives you its intended purpose; in this example, eliminate evidence. By itself, the name of a tool should not be the only measure of its use, but it is certainly a consideration.
Use of the software
The other factor I consider is the use of the software. When the name of the software clearly states its intended purpose (“Evidence Eliminator”) and the use of the software implies the intention of the user, it’s easy to say that anti/counter-forensics was employed. In one case I had, the custodian installed CCleaner after a preservation order, ran CCleaner multiple times using all types of settings, then uninstalled CCleaner the day before providing the computer for analysis. Clear intention, based on the type of software (and its name), its use, and the circumstances.
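The timeline reasoning in a case like this can be sketched in a few lines of Python. This is a minimal illustration only, assuming you have already extracted the relevant timestamps from your artifacts (registry, prefetch, or timeline tool output); the dates and event descriptions below are hypothetical, not from the actual case:

```python
from datetime import date

def flag_post_order_activity(events, preservation_order_date):
    """Return timeline events that occurred on or after the preservation order date.

    events: list of (date, description) tuples already extracted from artifacts.
    """
    return [(d, desc) for d, desc in events if d >= preservation_order_date]

# Hypothetical timeline: install, repeated execution, then uninstall of a
# cleaning tool, all after the (hypothetical) preservation order date.
events = [
    (date(2018, 3, 1), "cleaning tool installed"),
    (date(2018, 3, 2), "cleaning tool executed (prefetch)"),
    (date(2018, 3, 5), "cleaning tool uninstalled"),
]
order_date = date(2018, 2, 15)
suspicious = flag_post_order_activity(events, order_date)
```

The filter itself is trivial; the point is that the sequence (install, run, uninstall, all post-order) is what shows intent, not any single timestamp on its own.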
However, when the software/hardware used has an intended purpose other than anti/counter-forensics, the user’s intentions need a bit more showing. For example, a command prompt is an innocent application, except when used to format a drive containing evidence or to delete evidence files. A person who formats an external drive using a command prompt (or DiskPart) after a preservation order, then denies deleting any files, was clearly employing anti/counter-forensics measures using an application not designed for malicious purposes. In this specific use, you can call DiskPart an anti/counter-forensic tool. This is like a hammer used to build a house and also used as a dangerous weapon. The use and intention matter most, regardless of the type of tool used.
I like the easy cases of data wiping software used after a preservation order was served. The evidence may be gone, but showing the use of the software after the fact certainly makes for a better case anyway. I also like the more difficult cases where legitimate software has been used for bad purposes. This takes a little more time to show intention, but there is no difference between using Evidence Eliminator or a Command Prompt when the intention and result are the same. Both are anti/counter-forensics use of software regardless of the intended purpose of the software.
As far as “anti” versus “counter” forensics, I don’t really see a difference in the terminology, but you may prefer one over the other. Both are clear, in that they refer to anything someone has done to make your job more difficult. Some malware sorta works the same way: legitimate software used for malicious purposes.
By the way, the best anti/counter-forensics method that I have found is something I call the “Lake Washington Defense”. In one case matter that I had, the custodian kept throwing devices into Lake Washington, or rather, kept offering “the laptop fell out of the boat” types of excuses. In those cases, the data is gone…like really gone. I testified that the 'lost' items were apparently anti-forensic methods employed to prevent forensic recovery of any data :)
The Digital Corpora was updated with some new forensic test images! The list of forensic test images at https://www.dfir.training/resources/references/test-images-and-challenges/test-images-and-challenges/all is a very popular page, as it links to multiple TERABYTES of forensic test images. So, when a new test image is added, it is quite exciting to start downloading it to test your tools, use in a class, or just for plain practice.
This new set of forensic images includes cell phones, tablets, hard drives, and packet dumps! And it's not just random images of devices; they are all based on a scenario. Just as neat....you can download a teacher's guide with the images. How cool is that?
" The scenario was created during the summer of 2012 as part of a joint collaboration between the U.S. Naval Postgraduate School and the U.S. Military Academy at West Point." - http://digitalcorpora.org/corpora/scenarios/national-gallery-dc-2012-attack
The availability of forensic test images is one of the most helpful resources for learning forensics. You can practice without fear of making mistakes. Actually, you want to make mistakes and learn from them, as mistakes on test images help you grow without risking harm to a real case. When I started out in forensics (the days of the floppy.....), I ran around the office looking for every floppy I could find to practice on. You can imagine my joy in finding a box full of used floppies in the office supplies closet, waiting to be re-used. Of course...I found lots of juicy files that had been deleted, and recovered them with my trusty FTK v1.
Some DFIR tools are terrible…if not used correctly.
I saw an engaging discussion online about tool choices that inspired this post. I particularly enjoyed how someone gave an example of how tools get recommended. I changed the example a little, but the more I look at it, the more I remember this happening all the time:
Person 1: What tool can I use for “X”?
Person 2: Use this one.
Person 3: No, this one is better.
Person 4: But I like this other one better.
Person 5: That one sucks. Use this one instead.
Person 6: Why don’t you write your own tool?
Person 7: What’s wrong with my tool?
Person 1: Uh...thx?
This is great advice if:
-The question included specific details of the issue to be solved, and
-The tool(s) recommended can do the task, and
-The user knows how to use the tool.
This is bad advice if:
-The question was poorly framed, or
-The tools suggested don’t fit the need because the need was not described accurately, or
-The user doesn’t know how to effectively use the tool.
I also see complaints online about some DFIR tools, like:
-Why doesn’t this tool do encryption?
-How come email is so difficult to do with this tool?
-Why is the reporting so bad with this tool?
-How come this tool doesn’t find what I need it to find?
-Who validated this tool?
The solution to all of this is simple:
-Clearly define your forensic problem.
-Choose a tool designed to handle that problem.
-Use the tool correctly.
The things to not do:
-Don’t use a tool without knowing how to use that tool.
-Don’t use a tool without personally making sure that it works.
-Don’t use a tool that is not designed to do what you want it to do.
This all sounds so easy, but with the wide range of software available, it is easy to be overwhelmed with choices. Sometimes we fall in love with one tool and want it to do everything, even things that it may not be best suited to do or maybe not even designed to do at all. Sometimes we avoid a tool just because we don’t like the interface design. And many times we use tools without fully understanding what they are doing, what they are capable of doing, and just as important, what the tools are incapable of doing.
-Clearly define your forensic problem.
* Which OS, artifacts, etc…
* Desired output (depth of analysis, reporting, etc…).
-Choose a tool designed to handle that problem.
* Round peg in a round hole (don’t force a tool to do what it is not designed to do).
* Updated, maintained, used by the community, good reputation, etc…
-Use the tool correctly.
* Read the manual and/or take a class in that tool, and/or ask someone for guidance.
* Test it.
* Use it as designed for the problem designed to handle.
I have found that tools may have the same generic name and claim to do the same generic thing, but be extremely far apart in what they actually do. Without knowing the scope and limitations of your tools, you can miss everything in an analysis and not even know it. Or you can miss something so glaringly simple as to discredit your entire analysis, just because you didn’t employ an appropriate tool, or didn't use an appropriate tool correctly. To be clear, asking for tool suggestions is the best way to find the tool you need, as long as the question is framed correctly.
So, when I see questions like “What tool does this generic-thing-I-need best?”, I know exactly what is going to happen next…
SANS has a recent video on tools that has some pretty good info worth taking a look at:
Few clichés are more worn out than the tired “think outside the box”. I still say it, but when I do, I literally mean do not conduct an analysis solely within the physical box (the computer). Remember, everything that happens with data has happened because a person or persons made it happen. People do not live in a box. They live and operate in the outside world.
People are behind actions .
Every bit of evidence you find has a reason to be there. Someone made it happen. There was a thought, a plan, an intention, and an action to make it happen. For evidence that should exist but does not, the lack of evidence carries the same weight, since it takes someone to make it appear as if it did not happen.
Sometimes your job requires fixing a problem (such as a breach) and making it so the problem has less risk of happening again. In many of these jobs, identifying the suspect would be a waste of resources, since there is no remedy other than reducing the risk of the next intrusion. However, if you are in the business of catching bad guys, then you need to literally think outside the box with your forensic analysis.
Identifying the modus operandi (M.O.) can help identify the suspect’s intentions and identity. You can find the M.O. through forensic artifacts. You can also find more than just the M.O., like traces of evidence inside the box that lead to clues outside the box, such as geographical locations or the actual names of suspects. Outside the box, interviews with potential suspects, even those who may lie, can give clues as to what to look for inside the box. Just as it is important to validate your forensic findings, it is important to validate (corroborate) investigative findings with other facts. Finding evidence on a device is great, but it is much better when you have information obtained outside the exam that corroborates the evidence on the device.
The way I look at it, when tracking suspects, I use whatever clues I have at hand to lead me to the next clue. If I am starting with digital forensics, that means I want to use what I find in the box to lead me to a person outside the box. To do that, I need to remember that a forensic analysis is just a forensic analysis. But when you couple it with thinking outside the box, you get an investigation that finds your bad guy, and not just data.
The new-peer-review-no-name-yet task force, spearheaded by Jessica Hyde, is chipping away at the proposal of a new (but extremely different) peer review process for DFIR research.
I’ve gotten a few private messages that teeter on the edge of complaining about even the idea of creating a new peer review process, but each concern has been relieved after I clarified what we are working to come up with.
Here are some of the things I want to clarify:
Who should this process appeal to?
Who does this process not target?
So, you can see that this is primarily, if not solely, intended to give DFIR bloggers an avenue to have their work peer reviewed and to be recognized for it. It is so much easier to cite peer reviewed work in other research, so the community benefits as well. The blogger benefits by having his or her name stamped on the research without fear that another person or commercial entity will claim credit for the work that was done. At the bare minimum, for clarity: the work that will be peer reviewed in this process is work that would never have made it into an academic review anyway. That is the audience we hope to support: the DFIR practitioner bloggers.
The target audience is not expected to be earth-changing in size. Maybe just a few a year will be interested in the beginning. Plus, not every blog post needs to be peer reviewed. But research that is new, innovative, or creative…why not?
What a time to be in the field of DFIR! If you have been doing this work since the days of the floppy, you surely must be as excited as I am. If you are just entering the field, you will see even more advancements in the future than your predecessors have.
But let’s get on with one of the most important topics, one that is advancing our skill levels more than anything else ever has: instant documentation and sharing.
Many in the field have written (and keep writing!) about the importance of sharing and documentation. Without getting into ethical questions about sharing special discoveries, I want to talk about sharing generically, but specifically about the physical manner of sharing.
"One of the biggest issues in our industry is the dearth of documentation." — H. Carvey (@keydet89), July 2, 2018
The Internet gives us so many platforms to share information that it is practically impossible to keep up with it all. You cannot “follow” everyone. Google won’t find everything. Some platforms won’t give you access (only specific groups of people can get in, such as LEO-only), and on some platforms it is simply too difficult to keep track of the information that flashes across the screen only to disappear into the black hole of great information that no one saw.
For the DFIR info curators, the DFIR blog is the number one source of information, mostly because it is semi-permanent, easy to find, easy to bookmark, and almost always accessible to anyone with Internet access, without a special account.
Other means of dissemination are faster to put out and faster to reach an audience. Twitter is a prime example of being able to send out a bit of information in seconds that can potentially reach millions instantly. The negatives are that most tweets are not well thought out and lack depth, can be deleted, and are buried in seconds by hundreds of newer tweets. On top of that, if you don’t follow the tweeter and no one you follow retweets the wonderful information, it is as if it were never typed in the first place, because you will never see it. In all likelihood, there have been outstanding tweets of information that were buried so quickly that few people even saw them.
Social media platforms like Facebook and LinkedIn are only a little better in the sense that posts seem to last a little longer, but they are still not going to be as in-depth as a well-written blog post on research. Worse yet, viewers need an account on most of these services to see the posts.
In between sites like Facebook and micro-blogs like Twitter, we have Discord, Slack, and other chat services. Again, you need to be a member of the group to access them, and the information in many of these services flies by the screen and is buried in the blink of an eye. To even know about one of these services, you have to be lucky enough to catch the info on Twitter or be invited to the inner circle through contacts you have.
Based on research on social media content lifespans from http://blog.hcpassociates.com/how-long-does-content-last/, consider how the following graph relates to the DFIR information that we share.
Length of time content lasts on various platforms
Books and journals throw this graph out of whack, since the content in a book or journal is measured in decades. But let’s take away the books/journals. Here is what we get for content lifespan.
Length of time content lasts on various platforms (minus books/journals)
So here you can see that a blog’s content remains for about 2 years, while other social media does not even register on the chart. Beyond blogs, information doesn’t last longer than hours or minutes (18 minutes is Twitter’s lifespan!). This chart doesn’t even include chat services like Discord, which I imagine have a lifespan far shorter than Twitter’s, maybe lasting only seconds.
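To put the disparity in perspective, here is a quick back-of-the-envelope comparison. This is only a sketch: the two figures come from the numbers discussed above (a blog at roughly 2 years, Twitter at roughly 18 minutes), not from new measurement.

```python
# Rough content lifespans in minutes, using the approximate figures
# cited above (blog ~2 years, Twitter ~18 minutes).
MINUTES_PER_YEAR = 365 * 24 * 60

lifespans_minutes = {
    "blog": 2 * MINUTES_PER_YEAR,  # ~2 years
    "twitter": 18,                 # ~18 minutes
}

# How many times longer a blog post remains visible than a tweet
ratio = lifespans_minutes["blog"] / lifespans_minutes["twitter"]
print(f"A blog post lasts roughly {ratio:,.0f}x longer than a tweet")
# → A blog post lasts roughly 58,400x longer than a tweet
```

Even with generous rounding, the gap is four to five orders of magnitude, which is the whole argument for writing the blog post rather than only tweeting.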
Add to this the closed lists, closed forums, and closed chats to get the real picture of how much information does not reach the practitioner or is stored with any permanence. Couple this with the amount of time that anyone has to keep up on dozens of services and you get a very dark picture of how much we know compared to how much we could know.
Therein lies the issue. The faster the information is disseminated, the faster it disappears, or worse, is never seen. The slower the information is made available, the more people have access to it, but its relevance begins to fade over time.
* Substantial/important information should be documented at least to the level of a blog post. Tweets and chats should be short bits of content that reference the blog posts.
* Blog it on your blog.
* Guest post it on other blogs if you don’t have your own blog.
* Guest post it on other blogs even if you have your own blog.
* Publish it formally:
  * Do the above, plus...
  * Have it peer-reviewed and published.
* Make a video about it:
  * You can do wonders with a short, 3-minute video.
  * You can embed the video in your blog post and tweets, and update it when necessary.
Time is always going to be an issue for researching and sharing. Many of us barely have the time to research, or to fully dive into something unusual we come across in our daily duties. To require more than that is a lot to ask, but it is not unreasonable to ask you to share bits and pieces as you can.
One thing I can advise: if you don’t share what you find, someone else will find it. And someone else may share it and take clear credit for something you could have taken credit for. If credit is something that drives you, you need to put your name on it. If you don’t like discovering something and watching someone else take credit for it (when you never gave notice of your find), then you had better share what you find. I’ve spoken to a few folks who have complained as if someone broke into their home and stole their research to publish, but in fact, whoever finds it and publishes it first is the person who discovered it, whatever ‘it’ happens to be.
As for me, these complaints fall on deaf ears. Put your name on it or someone else legitimately will (this goes for individuals and corporations).
I use Twitter in the event that I come across something really hot that needs attention. But I also know that I miss 99% of everything that comes across Twitter, because I can’t live on Twitter. The same goes for Discord, closed forums, and the like. The time it takes to log into a service and maneuver through it to find information is mostly time I don’t have. But I monitor blogs on a daily basis. There are hundreds of blog RSS feeds at dfir.training that I check daily (sometimes many times a day), which saves me from clicking through bookmarks to see who has updated which blog. Blogs last long enough that I can check a few days later and not miss something. If you miss 10 seconds of Twitter, you will miss something. Phill Moore also saves me a lot of time with his blog, which tends to catch things I missed.
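The kind of feed monitoring described here is easy to sketch yourself. Below is a minimal, standard-library-only example of parsing an RSS 2.0 feed and listing post titles and links. The sample feed content is made up for illustration; a real monitor would fetch each blog’s feed URL (e.g. with `urllib.request`) and compare against the entries it saw last time.

```python
# Minimal RSS monitor sketch: parse RSS 2.0 XML and list post
# titles/links. parse_feed() takes the raw XML string so the
# fetching step stays separate (and the sketch runs offline).
import xml.etree.ElementTree as ET


def parse_feed(feed_xml: str):
    """Return (title, link) pairs for each <item> in an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    posts = []
    for item in root.iter("item"):
        title = item.findtext("title", default="(untitled)")
        link = item.findtext("link", default="")
        posts.append((title, link))
    return posts


# Hypothetical feed content, standing in for a fetched blog feed.
sample = """<rss version="2.0"><channel><title>Example DFIR Blog</title>
<item><title>New artifact write-up</title><link>https://example.com/post1</link></item>
<item><title>Tool comparison</title><link>https://example.com/post2</link></item>
</channel></rss>"""

for title, link in parse_feed(sample):
    print(f"{title}: {link}")
```

Checking a few hundred feeds this way is a single loop over stored URLs, which is exactly why an aggregated RSS page beats clicking through bookmarks.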
So here's the point:
--Blog some more
--Tweet about your blog posts
--Let the blog curators (like me and Phill ) know about your blog to help it get traction
More on the Rapid Peer Review for DFIR blogs is coming. There is a special ops team led by Jessica Hyde on a mission to figure out something that benefits the community, the researcher, and, just as important, the reviewers. One of the issues I see is ownership of current methods of publishing and peer review competing with the several options for doing this differently. Mostly I dread someone taking a stance that their way is better, regardless of whether it actually is better, or whether it is even a competing issue at all. My personal goal is to help create a system in which those who would not have published before will be able to publish now, not to take away from anything that anyone else is doing. The point is to share, make it easy to share, and make it easy to use the shared information.
What started as a question on Twitter, turned into a poll and Twitter discussion, and has begun to evolve into something interesting: the “Rapid Peer Review”.
I’ve had quite a few DMs and emails with several people over the past week discussing peer reviews in the DFIR world.
In short, academic reviews take too long to publish and are of limited practical value for practitioners. We need a better system.
During these discussions, Jessica Hyde coined the “RAPID PEER REVIEW” name, so I’m sticking with that.
Since this idea is evolving, here are some of the ideas being discussed, all subject to change:
* Process should take 30 days or less to be considered Peer-reviewed or rejected
* Previously peer-reviewed work (as seen in a published journal) would be ineligible
* Previously written work that has been cited or referenced may be judged as already peer reviewed by virtue of being source material of peer reviewed work. Meaning, if you wrote something that was later used in books and journals, then your work was probably already peer reviewed by those authors of books and journals. This process would simply verify and validate your work as cited.
* Peer reviewers would be practitioners within high-tech organizations or academia
* Work that has been RAPID PEER REVIEWed could still be eligible for journal publishing (but not the other way around)
Here are the benefits to you :
* Your research is recognized .
* Your work gets more exposure, not in the manner of becoming famous, but in that your research is shared more widely.
* Your name gets credited for your work.
* The community has another source (a validated source!) for research to build upon and learn.
* Your ability to cite peer-reviewed sources increases, rather than citing someone's blog post (that is, if you are citing for a book, journal, or legal matter that you are writing).
At this point, there are a few drivers pushing this idea along. I officially knight Jessica as the cat herder. Eventually (sooner rather than later), after some details are fleshed out, it would be good to see a few more interested parties join in to help with the physical labor.
We encourage comments, suggestions, and recommendations at any time. Currently, we have some ideas on who will peer review, where the peer-reviewed documents will be stored, and how the process will tentatively work. After we whittle it down a little more, I’ll write up the details of where it stands, which could look really good or need a total revamp…that’s where input will be helpful.
Who does RAPID PEER REVIEW affect?
One of the main points of the Rapid Peer Review process is avoiding the academia model at all costs. This is not to replace traditional publishing or scientific journals, or to compete with anyone who wants their research in a journal or book. This is for those who would not have published in a journal or book, or may not be ready to. This is for everyone who has written or will write some cool DFIR stuff that should be shared as a peer-reviewed work (of art and science…).
What do you not get for having your work in the Rapid Peer Review process?
The intention is simply to be a bridge between a blog post and a scientific journal.
Brett's Peer Review Model
I have posted on this a few times, as well as commented on Twitter, but the short answer is: "We don't peer review because it is too much work and too much time spent with no real personal benefit." Our jobs are not publishing, but actually practicing the trade of DFIR.
Now I see another reason why DFIR researchers may not be publishing their work via the 'academic journals'.
I feel that DFIR has been doing it right all along. Practitioners work. They find something interesting. They blog about it. Then everyone else takes advantage of their discovery. And when it’s really good, the practitioner writes up a Word doc, PDFs it, and uploads it to the Internet. Now it is memorialized forever (or until the Internet dies). I have suggested that the DFIR community add one little step between the PDFing and the posting: community peer review. The reason to add one thin layer of peer review is simply to validate the work that was done, so that citing it becomes easier and the DFIR discoverer gets permanent credit. The community benefits overall.
After reading " Some science journals that claim to peer review papers do not do so ", I see that there are even more reasons to avoid the academic route to journals unless your job is in academics or you want to go into that field.
I agree that there is validation, credibility, and personal satisfaction in having an academic peer-reviewed paper published in a journal, but everything required to do so goes against the very grain of DFIR work. DFIR research needs to be shared yesterday, not two years from tomorrow. The methods and artifacts that we discover are sometimes perishable, but certainly they are dynamic. The academic model for peer review doesn't work for DFIR research because it takes too long. In fact, very few practitioners read the scientific journals, and with that, the research will have been in vain.
My bias* as a practitioner is obvious, because there is no hurry in the academic world. The academic world does not deal with a breach where a business may go bankrupt in days, or where national security secrets are being siphoned out of a network, or where a child needs to be rescued after being lured online. Practitioners need the newest research as soon as it is ready ( ready to be put to use, not ready for the academic peer review process ).
As a matter of practicality, money probably needs to be involved in this process, because although I support working a job for fun, I do not agree that you should be required to work for free. How a business model for a non-academic peer review process would be developed is a topic that should be started sooner rather than later. The good news is that I see more than a few DFIRrs talking about it. Now that is cool.
*Side notes on my perspective and bias:
I have practiced DFIR in the public and private sectors, and taught it in the academic world. I tend to see immediate access to research as more important than a long publishing process.