
Hey everybody, welcome to our talk, "A Drop of Jupyter," where we're going to show you how to use Jupyter notebooks for a more modular and dynamic approach to penetration testing. Shout out to anybody who can get the reference; there's a hint right above the Jupyter logo, and if you get it, drop something in the chat. Here's a quick overview of our team; on the next few slides we'll go into a bit more detail on who we are and what we do.

So who am I? I'm a security consultant at Protiviti. I focus primarily on network pen testing, but I also do OSINT and tooling research, as well as operating as an amateur tool developer; I've made some stuff that's gained some traction in the bug bounty community, which I'll talk about a little more later in the presentation. Mostly, I'm really happy to be here, and shout out to the folks at BSides for giving us this opportunity. First I'm going to introduce one of our partners on this, Cody. Cody is a consultant at Protiviti who likes to refer to himself as a jack-of-all-trades pentester. He helps us with a bit of everything: a little web app, a little netpen, a specialty in mobile application testing, and he also prefers to automate report writing.

Who am I? I'm Nate Kirk, a senior security consultant at Protiviti. I specialize in network penetration testing and red teaming, and I'm also the infrastructure and tooling lead for the Protiviti attack and pen practice, so I actually worked with Omar on integrating this into our infrastructure and our day-to-day; we'll talk later on about how you can do it too. I'm also a former sysadmin and blue teamer.

Just a quick refresher, whether you're familiar with Jupyter or have no idea what it is: Jupyter is a non-profit, open-source project that grew out of IPython, an interactive command-line shell for Python. Historically, Jupyter has primarily been used for data science, but what we're trying to do is pivot that and show you how it can be applied to penetration testing specifically.

A quick overview of the functionality: it's a platform that gives you parallel documentation and execution at the same time, and it's a simple, collaborative file repository. Imagine combining your OneNote notebook of commands, scripts, and so on with the ability to execute them in place. There's no more copying and pasting or referring back to your past history; you have everything in one place for both documentation and execution. There are also data manipulation and reporting abilities; for example, you can integrate matplotlib or any other library, or use an R kernel for statistics or for developing reporting metrics. It's also accessible via a browser, which we'll touch on more later in the presentation.

Before that, I wanted to quickly give a brief overview of why Jupyter. The idea for creating a Jupyter notebook stemmed from when I was starting out as a junior penetration tester: I found myself constantly flipping through notes and writing long Bash scripts, trying to get things working without really having the knowledge and the know-how.
I felt like this wasn't an ideal learning environment for new testers coming in, either, so we came up with the idea of creating a Jupyter notebook containing the commands and the documentation, both for newer testers and for current testers, to automate and standardize their methodology. So, Jupyter in a nutshell: it's a simple, easy-to-use platform; it's a better way to facilitate training and project execution on an assessment at the same time; it provides a collaborative environment as well as management oversight; and there's currently a large opportunity for involvement from the penetration testing community, changing things so Jupyter isn't used just as a data science framework but as an offensive security framework in the future.

So how does Jupyter work? Jupyter has an active kernel, which enables continuous testing. The kernel we're using for our demonstration is a Python kernel, but there are over 100 kernels you can use; you could change your kernel to Ruby, to Go, to JavaScript. There are a lot of possibilities, and that contributes to the extensibility of Jupyter itself. Code and Markdown are input in a modular format: code goes into code cells and Markdown goes into Markdown cells,
and those cells can be rearranged and reused throughout the notebook for more modular testing. It operates on a server-client infrastructure that Nate will touch on later in the presentation. One of the coolest things, I think, is the combination of code you can use: for example, you can combine Python and Bash to create really modular scripts and strong, complex functions, and it opens up a wider array of possibilities for penetration testing, especially when you have a Jupyter instance running on your Kali machine.

I'm going to go over a few quick examples of the highlights I mentioned. For code and Markdown, you can see a very brief, stripped-down demo of what a notebook could look like. At the very top it says "This notebook introduces some Python concepts"; that's an example of a Markdown cell. Beneath that there's some Python code where you'll see a "hello world" print statement and some numeric functions, along with their output. Jupyter operates within these cells: you'll have a Markdown cell and a code cell, and those are the two main cell types in a Jupyter notebook.

Here's a quick example, which Nate will cover in a lot more depth, of what the server-client infrastructure looks like. Currently I have a Jupyter notebook running on my Kali machine, serving it up, and I access it via a web browser over HTTPS on a port that I specified. You can also set a password, which I would strongly recommend, especially if you're working in teams or have sensitive data inside your notebook. To reiterate the code-combination method I was talking about, this is a quick example of how Python and Bash can interact inside a Jupyter notebook; you'll see that there's a mix of both Python and Bash scripting.
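That Python-to-Bash handoff can be sketched as follows. In a notebook cell you would write the bang syntax directly (`!echo ...`); since that only works inside IPython, this standalone sketch uses `subprocess` as a stand-in so it runs anywhere, and the list contents are placeholders rather than anything from the talk:

```python
import subprocess

# In a Jupyter code cell this loop would read:
#   for service in services:
#       !echo "found: {service}"
# The ! prefix hands the line to the shell, interpolating the Python variable.
services = ["www", "mail", "vpn", "dev", "staging", "api"]

outputs = []
for service in services:
    # Stand-in for the notebook's ! magic: run echo as a subprocess
    result = subprocess.run(
        ["echo", f"found: {service}"],
        capture_output=True, text=True, check=True,
    )
    outputs.append(result.stdout.strip())

print("\n".join(outputs))
```

The useful part of the pattern is the direction of data flow: a variable set in Python drives a shell command, and the shell's output comes back into Python for further processing.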
As you can see, I've created a list variable in Python, and that list contains six strings. In the next two lines of code I use a loop to iterate across the list, and instead of using a Python print statement, I pass the variable I set in Python to Bash and use a Bash echo to write the output to the Jupyter notebook. This is a very simplified version of what you can do, but I hope it illustrates the possibilities available when using a Jupyter notebook for penetration testing.

Some key issues we've noticed, and some of the reasons we wanted to create a Jupyter notebook for penetration testing activities: current frameworks, such as the PenTesters Framework or whatever methodology you might be using (maybe you have everything dropped into a simple Bash script), lack the extensibility, the modular nature, the documentation and Markdown, and the collaborative nature that a Jupyter notebook provides for teams. What is Jupyter better at than other frameworks? It has a specific set of documentation, magic commands, and notebook extensions, and it's very easy to use; there's a GUI, so it's not complex or difficult to start with, whether you're a newer penetration tester or one with more skin in the game. And like I mentioned before, it's collaborative in nature thanks to the file explorer: anybody can download a file, upload a file, or edit a file, including the Jupyter notebook itself.

Some of those issues stem from the current environment in which penetration testing activities are conducted. There's a constantly changing landscape of new tools and methodologies, which makes it difficult to keep frameworks up to date and to share those framework updates, tools, or anything new popping up in the community. As a result, we see a lack of a truly collaborative and modular framework for pen testing. Combined with a lack of standardized methodology within teams, there's also documentation sprawl: commands, scripts, and notes littered everywhere, which makes it really hard for newer pen testers to hop into the game. Having a centralized repository helps with that, and it starts to reduce another issue, the barriers to entry for newer testers.
So, our solution. It's relatively simple, and I hope y'all have guessed it by now: we want to use Jupyter notebooks to develop modular, standardized automation frameworks for the execution and documentation of core penetration testing activities. To illustrate that, we'll go into a demo of an OSINT notebook we've created. It's a bit of a stripped-down version of what we use in the lab, but nonetheless it's a good demonstration of the functionality you can leverage with Jupyter to perform key activities.

Here's a quick overview of what we're going to cover and what the Jupyter notebook will do. On one side there will be subdomain enumeration, then email enumeration and vendor service enumeration, plus some additional functionality we'll work through live: after the data is gathered, there's some formatting, verification of whether a domain is alive or not, and ultimately, at the end, parsing everything into a deliverable that management can look at, thanks to the collaborative nature of Jupyter, or that anybody can pull down. So we'll go ahead and start the demo.
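One of those steps, checking whether a discovered domain is alive, can be sketched in pure Python. The notebook's version is a Bash bulk reverse-DNS script; `socket` is used here as a hedged stand-in so the sketch runs anywhere, and the test domains are placeholders:

```python
import socket

def alive_domains(domains):
    """Return the (domain, ip) pairs for domains that resolve."""
    alive = []
    for domain in domains:
        try:
            ip = socket.gethostbyname(domain)
            alive.append((domain, ip))
        except socket.gaierror:
            # No DNS record: treat the host as dead and move on
            pass
    return alive

# ".invalid" is a reserved TLD (RFC 2606), so it will never resolve
result = alive_domains(["localhost", "no-such-host.invalid"])
print(result)
```

Feeding the deduplicated subdomain list through a function like this is what separates the "alive subdomains" tab of the deliverable from the raw enumeration output.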
Okay. Just to give a quick overview of how Jupyter is structured, and to make some of what I was talking about sound more practical: Jupyter, like I mentioned, has a file repository. Within this file repository I've created a folder called demo, and you can see and view all the running Jupyter notebooks, including some of my personal ones that I use for testing or bug bounty. There are also, like I mentioned, extensions you can use to change how your code is displayed, reporting features, and a lot more than what's being displayed right now; there's a lot of functionality you can add with these extensions, but for now we'll keep it as basic as possible.

Moving on to the Jupyter notebook itself, I'll walk through some of the additional functionality. Jupyter has a File tab where you can open notebooks and, if you're using it collaboratively, check the version history to see how the notebook has changed, similar to Google Drive; you can also download the notebook. And there's additional functionality around running cells. Again, you have code cells, like the start-time cell you can see here: I've imported datetime and printed it out as a start time. Some of the data shown right now isn't accurate, so I'll clear all the output, which will hopefully give you a better idea of how the notebook runs. Now you'll see no output from these tools, but you can see I have some Markdown cells and some code cells.

A few of those cells make up a setup script that runs at the start of the OSINT notebook. For example, in this cell I'm setting all the variables to be passed throughout the notebook: in Python I'm saying the domain is protiviti.com, the client is Protiviti, and I'm creating a folder named demo to store all the output from the notebook. And here again is an example of combining Bash and Python, like I mentioned before: I perform a host lookup on the Python domain variable, except I do it in Bash, then I grep the IP address out of that Bash output and store it back into a Python variable. I hope that illustrates the dynamic nature of code within Jupyter.

I'll go ahead and run this cell block. It's relatively simple, and since I've already created the folder you'll see an output saying the folder cannot be created, which is totally fine. Then I print out all the variables I just set so I can show y'all how it works, and now you can see the client name has been set to Protiviti, and so on and so forth. Another core portion of the setup is the tool initialization script, which we don't have to go into in too much detail; it's relatively straightforward.
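A sketch of that tool-initialization idea: resolve each tool name to a full path once, store it in a dictionary, and reference the dictionary everywhere else in the notebook. The talk's version shells out to `locate`; `shutil.which` is used here so the sketch is self-contained, and the tool list is illustrative (`ls` and `grep` stand in for hosts that don't have the OSINT tools installed):

```python
import shutil

# Tools the notebook expects: subfinder/assetfinder are from the demo,
# ls/grep are stand-ins so the sketch works on any Unix-like system.
wanted = ["subfinder", "assetfinder", "ls", "grep"]

tools = {}
for name in wanted:
    path = shutil.which(name)  # the talk used `locate`; which() checks PATH
    if path:
        tools[name] = path
    else:
        print(f"[!] {name} not found; install it or fix your PATH")

print(tools)
```

Later cells can then run `tools["subfinder"]` without caring where the binary actually lives, which is what makes the notebook portable between lab instances.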
Essentially, all it's doing is trying to create a modular way for me to reference tools in the notebook. For example, instead of specifying the whole path of a common OSINT tool like subfinder or Sublist3r, it performs a locate and stores the path in a dictionary, so I can reference it throughout the notebook. I'll run this script to initialize all the tools I'm going to use, and you should start to see some output; you'll see that all these paths have now been set in a dictionary.

Going back to the overview I showed, one of the first things I want to show is our subdomain enumeration process. Again, this is a bit stripped down because of time constraints and the nature of the demo, but I wanted to show you what a standard tool like subfinder looks like running inside a Jupyter notebook. The command is already written in there, so if I'm a new tester and I've never run subfinder before, I can read the Markdown description of what it does, and all I have to do is click Run on the cell block; it then starts enumerating the subdomains for me inside the notebook. It mitigates the need to really understand terminal commands; of course that's still important, but for training and automation purposes this can be really useful during assessments.

Subfinder will run, and once it's completed, the next tool runs. The next tool is called assetfinder, written by tomnomnom, and similar to subfinder it performs the same function; typically when doing OSINT you want to spread your landscape a bit and use multiple tools to get as many subdomains as you can. So I'll run assetfinder, and similarly to subfinder, you'll see the output coming into the notebook in real time. One thing I do want to show is that I'm performing a tee in Bash to store all these domains in a separate file, so I, or anybody from my team, can go into the BSides demo folder. Some of these files were already generated on earlier runs, but they'll be overwritten as we go through the demo; this domains.csv was modified seconds ago, so it's the one we just ran.
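The sort-and-dedupe step that follows can be sketched like this. The notebook does it with Bash (`tee` to accumulate, then something like `sort -u` to dedupe); this Python equivalent is a stand-in with made-up sample data so the mechanics are visible:

```python
import tempfile
from pathlib import Path

# Simulate the combined subfinder + assetfinder output (with duplicates),
# the way the notebook's tee accumulates results into domains.csv.
raw = ["a.example.com", "b.example.com", "a.example.com",
       "c.example.com", "b.example.com"]

outdir = Path(tempfile.mkdtemp())
domains_file = outdir / "domains.csv"
domains_file.write_text("\n".join(raw) + "\n")

# Equivalent of running `sort -u` over the file in place
unique = sorted(set(domains_file.read_text().split()))
domains_file.write_text("\n".join(unique) + "\n")

print(f"{len(raw)} raw -> {len(unique)} unique")
```

Because the deduped file lives in the shared folder rather than in one tester's terminal scrollback, anyone on the team gets the same cleaned list.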
I can go into it and see all the subdomains from assetfinder and subfinder output together. I can edit this, I can send it off to somebody, anybody can pull it down; this really shows the collaborative nature of Jupyter. Of course, a common problem is that with all these subdomains, I need to dedupe them, so within Jupyter I have a small script to format and dedupe the list. I have about 870 subdomains, not unique, and I want to sort them and extract the unique entries. In Jupyter I can run the Bash script I wrote, or anybody can run it regardless of their Bash scripting experience, and it gives me an output I wrote in Python to let me know it's complete, with a print "complete" statement. If I go back to the domains file and refresh, I'm now left with 344 unique subdomains. Again, I hope this illustrates the ease of use: you're not typing commands, you're not copying and pasting from another notebook or from your notes; you have everything in a centralized repository for testing, and everybody can use the same standardized methodology you are.

I'll give a quick overview of some of the additional functionality inside the notebook, and then I'll do a Run All so you can see how long it takes for the automation to run all these commands. In a nutshell, like I showed in the overview before: I'll be verifying which domains are alive with a bulk reverse-DNS script; I'll be scraping emails via the Infoga tool; I'll be looking at Shodan, which automatically takes all the IP addresses from the subdomains and runs various searches and queries against them; I'll also be looking at cloud enumeration to see what cloud assets are available; we'll pull from a breach list to see what breach data exists for the domain we're using, protiviti.com (of course that information is redacted in the deliverable, so there are no passwords or anything of that nature); and then there's a personal script I developed called GitDorker, which essentially automates running GitHub dorks against a specific domain. For example, if you put in protiviti.com, it runs queries for password, connection string, and so on. Of course, all of that information is already redacted, and the only results you'll see are from my own GitHub repository, which I use to show fake results.

Then towards the end there's a parsing script I wrote that parses all the outputs from this notebook into a clean, nice Excel deliverable. You can use that as a base point; I'm not saying this is a perfect method for OSINT, but you can build a very strong base to work from, regardless of your experience, by using Jupyter notebooks, collaborating with people, and creating stronger notebooks over time. Due to the modular nature, it's a very easy base to build on and improve in the future.
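A stdlib-only sketch of that final parsing step: gather each tool's CSV output into one structure keyed by its future sheet name. The real notebook writes an Excel workbook at the end (pandas' `ExcelWriter` is one common way to do that last mile, shown in a comment); the filenames and sample rows here are illustrative, not the talk's actual data:

```python
import csv
import io

# Stand-ins for the per-tool CSVs tee'd into the engagement folder
csv_outputs = {
    "Alive Subdomains": "domain,ip\na.example.com,192.0.2.10\n",
    "Emails": "email\nalice@example.com\n",
}

# Parse every per-tool CSV into rows, keyed by sheet name
workbook = {}
for sheet, text in csv_outputs.items():
    workbook[sheet] = list(csv.reader(io.StringIO(text)))

# With pandas installed, one extra loop writes the real deliverable:
#   with pd.ExcelWriter("deliverable.xlsx") as xl:
#       for sheet, rows in workbook.items():
#           pd.DataFrame(rows[1:], columns=rows[0]).to_excel(
#               xl, sheet_name=sheet, index=False)
for sheet, rows in workbook.items():
    print(sheet, "->", len(rows) - 1, "rows")
```

Keeping the parse step as its own final cell means the deliverable regenerates on every Run All, so the Excel sheet is always in sync with the latest tool output.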
So I'll go ahead and run the Jupyter notebook in full. The way I do that is to start at the top and click Cell, then Run All. Now the notebook starts going through each of these cells. You'll see the start time printed; this is just something I wanted to put in to show you how quickly the notebook can run. It started at 15:45 in military time, 3:45:37 our time. You can see it's run through the setup script, so it set the domain variables, and it's run through the tool initialization script. I'll minimize this; it's currently running through subfinder, then it will run through assetfinder and continue through these different scripts, some personally written and some brought in from third-party resources or tools developed online. Again, there's a lot of room within Jupyter to put in your own customized Python and Bash scripts to create a truly modular framework for pen testing. As we go through, we can see that the email gathering portion has started, and after that Shodan will start taking all the IPs that were automatically grabbed from the DNS probe, and so on and so forth until all the tools are completed.

While we let this run, I want to give you a quick heads-up on what the deliverable itself will look like. If you can see my screen here, I have an Excel deliverable that would be parsed at the end of this notebook. I have all my alive subdomains, unique and sorted A through Z, with the corresponding IP, which makes it a lot easier for a tester to view and say, okay, I know what's live, I know what to test; I can create a new column for comments, check what's there and what's not, and standardize that methodology so we're not running into inconsistencies while testing within teams. Here you can see the cloud assets output from the cloud enumeration tool that was used; a tab for IPs only, if you want to do further testing on the IPs themselves; and the Shodan output for all the IPs that were queried, with the various ports and, of course, the results for those ports, and so on and so forth. You can also see the tool I developed, GitDorker, which outputs the dorks themselves, the specific URL to hit to access each dork (so you don't have to build anything yourself), and the number of results to filter and search through. Then here's the breach list from the command we ran to pull breach data from Dehashed; as you can see, everything's redacted for obvious reasons, so there will be no stealing of passwords here, but I did want to show you the ability to pull large data sets relatively easily with Jupyter. And finally, as you saw before I switched over, there was a script running called Infoga to pull email data; this is already cleanly parsed and scraped from Infoga and output into one clear deliverable. Again, there are no additional actions I'm taking at the end of this notebook; all I'm doing is clicking Cell, Run All, and I get this whole entire Excel sheet parsed in a clean, nice format to continue from as a base for my testing.

So let's go back and check on the OSINT itself. You can see it's currently running the Shodan script; we can check back on this later to see the total time and how
long it took. Typically, Shodan will be the longest portion; as you can see, the rest of the tools ran within seconds, but because of the large number of IPs for the domain, Shodan may take a little more time. So what I'll do now is hand it off to Nate for the infrastructure piece.

Okay, so as Omar was going through the demonstration there, I'm sure a lot of people are wondering what the underlying infrastructure looks like for something like this. I was thinking the same thing as we were creating and experimenting: what kind of horsepower do we need under the hood, and what will this look like when we roll it out to an entire lab? Luckily, Jupyter is pretty lightweight, and so are a lot of the tools we're using, so we first deployed this on a t2.micro in AWS, later moving up to a t2.medium. It's really accessible to anybody; in fact, the demonstration Omar just did was running on a local Kali instance. We prefer to use the cloud for its collaboration abilities, but this is something anyone could run pretty much anywhere, on any flavor of OS they want, as long as you're hitting Jupyter's requirements.

Talking about AWS, and really any cloud provider: we enjoy being able to stick it on a public address and then restrict the security group and network ACL down to just letting certain people, certain lab IP addresses, or VPN addresses get to Jupyter, or really to anything of ours. By restricting it that way but still opening it up to the team, I'm able to jump on from my Protiviti VPN network, and Omar's able to jump on from his in a different lab; I can jump on and do something, and Omar can jump on and pull down the deliverable or check how the OSINT is going on a project. It's made it really collaborative from that standpoint.

We prefer to stand up a new instance for each new engagement, so there's no data contamination or crossover, and if a tool breaks we're not worrying about it. It also opens up experimentation for people: hey, if you want to add a new module to Jupyter, feel free; if you break it, we'll just spin you up a new instance. We're actually doing that through plain Bash scripting; that's how we deploy Jupyter to Kali instances. It's just a Bash script that pulls Jupyter down and pulls our configuration file, which Omar talked about: you can customize SSL, put a password on it, change the default port, and do all kinds of cool stuff with it. So it's really accessible to anyone who wants to do automatic installs; it's fairly simple. We're running it in a screen session, which has worked well in our experimentation, and since our instances are disposable, we haven't had many latency or stability issues.

One of the great things about Jupyter is its extensibility and some of the crazy use cases you can put it to. The thing I really love is that with the Markdown, you can actually be teaching people as they go, whether it's their first time running something or their first time ingesting data into it. There are so many different tasks you can integrate with Jupyter. Just one example is BloodHound: you have all that data and you want to run some cool custom queries, the same ones on each engagement, automated and consistent from engagement to engagement. Instead of querying Neo4j, the backend database, yourself, you can add that as a module in Jupyter and have it do it for you: an explanation before it, then the run, then a nice deliverable output that can help with your hunting. Other things like that with offline data would be Active Directory reviews, Azure or other cloud provider reviews, or O365 reviews, where you're pulling large amounts of data, even raw data, into a database. As long as you have the ability to query it, either through Bash at the command line or through a database interface, you can automate it through Jupyter, keep the one-at-a-time explanations, and keep it simple and concise. The reporting, as Omar showed with that Excel sheet output, really helps with digesting the data and even making a formal deliverable for a client, or for yourself if you're doing bug bounty hunting and trying to stay organized; I know my notes are sometimes a mess, and this has really helped keep everything condensed in one central spot.

So just to sum things up: if you can script it in any programming language, and you can document it in Markdown, Jupyter can handle it. The beauty of that is the modularity, combined with keeping a simplified format, and the ability to connect it to something like a Kali instance with your custom tools on it, or to pull in cool new tools from GitHub; the possibilities are really endless.

Awesome, thank you, Nate. Before we go into the code release, I think we can go check back on the Jupyter notebook and see the runtime. I'm positive it finished, probably around the time Nate was talking, but we didn't want to stall the presentation, so just
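The SSL, password, and port customization Nate mentions lives in Jupyter's config file, which the deploy script can pull down. A sketch of what that might look like, assuming the classic Notebook server's `jupyter_notebook_config.py` options (`NotebookApp`; newer Jupyter Server versions use `ServerApp` instead): the paths and port number are examples, `c` is injected by Jupyter's config loader, and the password hash placeholder should be generated with `jupyter notebook password`, never committed in plaintext.

```python
# ~/.jupyter/jupyter_notebook_config.py (sketch; paths and port are examples)

# Listen on all interfaces so teammates on the VPN can reach it,
# relying on the cloud security group / network ACL to restrict access
c.NotebookApp.ip = "0.0.0.0"

# Non-default port, as mentioned in the demo
c.NotebookApp.port = 8443

# Certificate and key for HTTPS
c.NotebookApp.certfile = "/etc/jupyter/ssl/notebook.pem"
c.NotebookApp.keyfile = "/etc/jupyter/ssl/notebook.key"

# Hashed password generated with `jupyter notebook password`
c.NotebookApp.password = "argon2:..."  # placeholder, not a real hash

# Headless server: don't try to open a local browser
c.NotebookApp.open_browser = False
```

With a file like this baked into the deploy script, every freshly spun-up engagement instance comes up with the same hardened settings.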
give me one second. Picking back up from where we left off, we were at Shodan. We can see that our Shodan scraping script, which queries all the IPs we got from our reverse DNS, has already finished, and it gives us some output saying the results are stored as "shodan," the name of the output file. Some vendor service enumeration was conducted using the cloud_enum tool, which gives us the buckets we saw previously in the Excel deliverable; then there's the pull of the Dehashed breach list, which we also saw, with the information redacted; and then again my personal tool, GitDorker, which runs automatically inside the notebook. You can see all the terminal output showing the various links you can click on, or, for a probably easier experience, you can go into Excel, format everything how you want, sort it, and so on. Again, you get output telling you where everything has been stored, so I can go into the BSides demo folder and see all the information stored individually. If I want to look at emails, I can see the emails, and I have full rights to edit this file; if I want to add something, I can add my email in here, it's relatively simple, and I can save it, so if I click out of this and go back in, you'll now see my email. And so on and so forth for the rest of the outputs.

The beauty at the end, like Nate reiterated, is that there's also a parsing script that runs last and parses all the CSVs that were output into one Excel deliverable, which we demonstrated before. It's relatively straightforward, and you can modify it as you see fit; we'll touch on that once the code release comes out. Towards the end, you'll see the Jupyter notebook itself had an end time of 15:56, or 3:56 our time, and we started at 3:45, so you had this whole Excel deliverable parsed and ready for you within 12 minutes. I think that's a great base and a great start, especially for somebody who has no idea what pen testing or bug bounty hunting is, or who wants to get a start on their OSINT; it's a perfect way to get started, remove those barriers to entry, and have a good base to work from in a relatively short amount of time, removing wasted hours from training and also expediting assessment time during penetration tests, especially in the corporate environment. Now we'll go back to the presentation itself, if y'all can see it.

Talking about code release: for release context, the Jupyter OSINT notebook is formatted like any other Jupyter notebook, in the .ipynb format, so it runs within the Jupyter platform. For the release date, we're planning to release within one week of when this presentation comes out. The release location will be my personal GitHub, which you can find at github.com/obheda12, and I'll have some information coming out about performing Jupyter modifications and creating your own Jupyter notebooks through my Twitter, which is the same handle, obheda12. Lastly, some follow and contact details: I already mentioned mine, and you can follow Nate on Twitter at naderang. Aside from that, that concludes our presentation. Again, we're super happy to be here and really thankful to everyone at BSides who made this possible, especially with the hassle of everybody being remote and the craziness that is COVID. Shout out to those folks, and now we'll take any questions if anybody has some.