
Uh, so I'm looking forward to hearing this. We've got Hardik Parekh, who's going to be talking about navigating the DevOps security journey with OWASP SAMM. Um, without further ado, I'll hand it over to Hardik.
Hi everyone. In today's DevOps era there is increasing pressure to speed up time to market. To meet this demand, organizations are relying on an increasing number of tech stacks, deployment models, and open source software, without clear security requirements. It would be an understatement to say that modern software is growing in complexity; as a result, 75 percent of vulnerabilities are application related, yet there is no prescriptive and measurable way for organizations to analyze and improve their software assurance posture. To address these challenges, we recently released OWASP SAMM 2.0. My name is Hardik Parekh, and I'm here to share my knowledge about how to navigate the DevOps security journey at scale with the OWASP Software Assurance Maturity Model 2.0. Before we do that, I wanted to share a little bit about myself. I started my career as a software engineer, like many of you, and I got opportunities to lead various security teams and programs at medium and large corporations like Dell EMC, RSA, Intuit, Amazon, and Splunk. I'm also on advisory boards for a few security and data privacy startups and for a non-profit trade organization, CompTIA, which issues IT security certifications such as Security+ and others. Throughout my career I've contributed to the industry through various SAFECode publications, the SANS Top 25 Programming Errors, CVSS version 3.0, BSIMM since its very first version, OWASP SAMM, and, most recently, NIST's SSDF. For those of you who might not be familiar with BSIMM, BSIMM is another
software security maturity model, which is descriptive in nature. It was developed by Cigital by interviewing six independent software vendors from the SAFECode founding member companies, and EMC was one of them, back when I was leading a product security team there. It is a descriptive model of what activities other security organizations are performing at each maturity level, versus a prescriptive model, which SAMM is. So the main difference between BSIMM and SAMM: while BSIMM is a descriptive model, SAMM is a prescriptive model that tells you exactly what you should be doing at each level of maturity while building your software security assurance program. I started contributing to SAMM back in 2016 and have been a co-author and one of the core members of the OWASP SAMM project ever since. Now, before we start the talk, I'd like to get some legal stuff out of the way: I'm not speaking on behalf of my current or previous employers, nor am I here as a representative of them; my opinions are solely my own, and they do not reflect those of my current or any previous employer. A quick agenda: I assume many of you might not be familiar with OWASP SAMM, so I'll spend some time introducing SAMM, covering what SAMM is, why we need a model like SAMM, the core principles of SAMM, and the project history. After that, I'll go over changes we
introduced in SAMM version 2.0 over 1.5, and last we'll discuss how to apply SAMM in your organization to navigate the DevOps security journey at scale and build a software assurance program. Now, if you have one of these 15-plus roles, you'll be able to take valuable information from this talk and apply it in your organization starting Monday. Let's start by looking at what SAMM is. OWASP SAMM is one of the flagship OWASP projects; flagship status is given to projects with strategic importance to both OWASP and AppSec in general. OWASP SAMM is a framework for software assurance that provides an effective and measurable way for all types of organizations to analyze and improve their software security posture, tailored to the specific risks facing that particular organization. SAMM is full of useful resources that help with evaluating an organization's current security practices, providing recommendations and suggestions for growing and maturing those practices, providing a way to demonstrate concrete improvements over a period of time, and defining and measuring security activities throughout the life cycle. One of the big benefits of SAMM is that it is vendor agnostic: SAMM can be done in-house, or you could have one of the several AppSec consulting firms help you with the assessment and with creating plans and roadmaps. Now, why do we need a model like SAMM? As we discussed earlier, in the quest to increase speed, today's organizations are growing in complexity without any clear security
requirements, and as a result almost 75 percent of vulnerabilities are application specific. If you look at the latest security breaches, like Equifax or Capital One, they are application related; they are not perimeter-centric. To standardize security activities in such a complex software environment, we need a model like OWASP SAMM. I like this quote from George Box, who has been called one of the great statistical minds of the 20th century: "The most that can be expected from any model is that it can supply a useful approximation to reality: all models are wrong; some models are useful." I repeat: all models are wrong, and some models are useful. The point is that you can't find a model that exactly describes reality; there are too many variables in real life. But you can have a model that is close enough to reality to be useful, and that is what SAMM aspires to be. SAMM was defined with flexibility and versatility in mind, so that it can be utilized by small startups, mid-size organizations, and large corporations, using any style of development methodology, be it Agile, DevOps, or waterfall; it is development-methodology agnostic. Additionally, this model can be applied organization-wide, for a single line of business, or even for an individual project. SAMM is both measurable and actionable: it defines maturity levels across business functions and provides clear-cut pathways for improving the
maturity levels, thus providing a step-by-step navigation plan to achieve higher levels of maturity. Now let's look at the project history, for those of you who might not be familiar with SAMM. The very first version of SAMM was originally created through the OpenSAMM project, led by Pravir Chandra, an independent software security consultant. After a number of years in a somewhat dormant state, around 2015 a small group got together at OWASP and worked to breathe some life into this project, and it became an OWASP project. Version 1.1 of OWASP SAMM expanded and restructured its predecessor into four complementary resources: the core document, the How-To guide, the Quick Start guide, and a toolbox, which is a spreadsheet that provides simple automation for data collection, metrics, and graphs. Back in 2017 we released version 1.5 of SAMM, which incorporated a refinement of the scoring model to provide more granularity in the assessment. We recently launched SAMM 2.0, in February 2020, where we changed the measurement model one more time to add a qualitative measurement representing how well an organization is performing a particular security practice. In addition, we made several structural changes to SAMM; as a result, SAMM 2.0 is not backward compatible. For the folks who are using SAMM 1.5: SAMM 2.0 is not backward compatible, so please keep that in mind. SAMM is built on a few core principles.
The first one: an organization's behavior changes slowly over time, so changes need to be small and iterative to really take hold and make a real difference from a security perspective. The second: there is no single recipe that works for all organizations. SAMM is built with this in mind and supports an organization in building a program that is tailored to its risk profile, its culture, and its existing level of security maturity. The third: guidance related to security activities must be prescriptive. Many security initiatives fail due to poor detail, lack of communication, or invalid assumptions; overall, the success of the program depends on being simple, well defined, and measurable. Let's take a look at the maturity levels and the assessment scores, and how we calculate them. SAMM defines three levels of maturity: ad hoc provision, increased efficiency, and comprehensive mastery at scale, mainly through automation. Unlike other maturity models such as BSIMM, and SAMM's own predecessors, there are four levels of assessment scores for each security activity, which makes fine-grained improvements even more visible: no implementation; implementation across a few or some projects; implementation across at least half of the projects; and implementation across many or most projects. Now, not everyone needs to reach level three in all areas. Similar to Six Sigma or CMMI, the goal is not to max out on each and every practice; honestly, that wouldn't even be a good use of your limited resources.
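To make those four answer bands concrete, here is a minimal sketch of how they could be mapped to numeric weights for scoring. The exact fractions are my illustrative assumptions in the spirit of the SAMM toolbox, not official values:

```python
# Illustrative numeric weights for the four SAMM answer bands.
# (Assumption: these fractions are for illustration only; check the
# official SAMM toolbox for the actual weights it uses.)
ANSWER_WEIGHTS = {
    "No":                        0.0,   # no implementation
    "Yes, some projects":        0.25,  # a few or some projects
    "Yes, at least half":        0.5,   # at least half of the projects
    "Yes, most or all projects": 1.0,   # many or most projects
}

def answer_score(answer: str) -> float:
    """Return the numeric weight for one interview answer."""
    return ANSWER_WEIGHTS[answer]
```

With a mapping like this, "partially done" answers contribute partial credit instead of forcing a misleading yes/no.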
We all struggle with limited budgets and limited resources, whether at a small startup or a large corporation, so in that environment we do not recommend maxing out every area and achieving level three, especially not in the first attempt. What the target maturity should be for your organization is largely up to you, and it depends on the business drivers and risks your organization is facing. For example, if you are in the financial space, you have one risk profile; if you are an e-commerce website, you have a totally different risk profile; if you are holding customer data, a very different risk profile again; and if you are none of those, your risk profile and business drivers will be different still. For some companies, business drivers also focus on certain compliance aspects, and all of these factors play a very important role in setting your target maturity level. In 1.5 we modified the scoring model to provide multiple-choice answers to allow for more accurate assessment. Previously, in SAMM and most other models, the questions were yes/no, which is great from an academic perspective, but in the real world we know the answer often lies somewhere in between. For example, try to answer the following question, from Education and Guidance at level two of a SAMM assessment: are those involved in the development process given role-specific security training and guidance? Say you have trained some of the developers and would like to train some project managers and testers; given the question, would you answer yes or no? The dilemma is, if you answer yes, you will have challenges getting budget for your next iteration, because you already said you're doing this, so there is no incremental value you're going to bring. At the same time, if you answer no, that's not correct either, because you already trained some of the roles, just not all of them. That is why we introduced this more fine-grained measurement level in 1.5. Now, based on user feedback, there were still some areas where we wanted to improve
SAMM from where it used to be, so as a result of the feedback from the community we changed the measurement model yet again in SAMM 2.0: we now assess the activities along two axes, coverage, by means of questions, and quality, by means of mandatory criteria. I don't want to go into too much detail on this slide; it's just a quick snapshot of SAMM 1.5 so you understand the changes we made in 2.0. SAMM is defined at three different levels: at the highest level, 1.5 defined four critical business functions; each business function has three security practices; and each security practice has three levels of maturity. Now one would ask, what are the motivations behind a new version? There were five main motivations for coming up with the new SAMM version. First, align with recent development methodologies such as Agile and DevOps to make it development-methodology agnostic. Version 1.5 and its predecessors looked more suitable for the waterfall development methodology, even though that was never the intent; we realized it was missing key guidance on how to securely build and deploy software, especially since CI/CD has become an integral part of Agile and DevOps methodologies. The second motivation was to improve the measurement: as I discussed earlier, even though 1.5 addressed the feedback about granularity, it still didn't address the question of how well an activity is being performed,
thus needing some qualitative measurement criteria. The third was to avoid orphaned or unrelated activities. There were quite a few security activities in SAMM 1.5 that lacked a consistent theme across the maturity levels within a security practice, which resulted in a few orphan activities, such as code signing. It was odd that code signing appeared at maturity level two, but there was nothing related to code signing at maturity level one and nothing at maturity level three; code signing thus felt like an orphan activity dangling at level two. The fourth motivation was to arrange maturity levels in increasing order of difficulty. Another shortcoming of 1.5 and its predecessors was that sometimes security activities at a higher level were easier to implement than security activities at lower levels, which really didn't make sense. The fifth and last motivation was to improve the production process: the SAMM production process itself was slow and waterfall, resulting in a major overhaul every time we wanted to create a new version. As a result, we felt the need to come up with a new version, and hence SAMM 2.0 came to light. Now, this is the SAMM 2.0 framework at a very high level, and the highlighted areas are the changes from SAMM 1.5, which we'll cover in a few minutes. SAMM 2.0 is defined at three different levels. At the highest level,
SAMM defines five critical business functions instead of four. Each business function is a category of activities related to the nuts and bolts of software development. For each business function, SAMM defines three security practices, and each security practice is an area of security activities that build assurance for the respective business function. For each security practice, SAMM defines three maturity levels as objectives, and each level within a security practice is characterized by a successively more sophisticated objective and more stringent success metrics than the previous levels. Overall, as you increase the level of maturity, you should expect a higher cost of implementation. If you look at the framework, you can see that Governance is focused on the program itself, looking at the more strategic elements: Strategy and Metrics, Policy and Compliance, Education and Guidance, and so on. We renamed the Construction business function to Design and introduced a new business function, Implementation. Together, Design, Implementation, Verification, and Operations cover the whole software development life cycle. Design focuses on threat assessment and security requirements during the earlier phases; Implementation focuses on secure build, secure deployment, and defect management; Verification focuses on testing and verification; and Operations focuses on incident detection and management, and on managing the environment where the app lives. Now let's take a closer look at the SAMM 2.0 core model and some of the security practices. Each security
practice is divided into two streams, stream A and stream B. The purpose of the streams is to align and link activities within the practice across the different maturity levels. Each stream has an objective to be reached, and this objective is reached at increasing levels of maturity. This way we ensure there are no orphan activities dangling that seem relevant only at a single maturity level, like the code signing example we discussed in version 1.5. Let's take a closer look at one of the security practices: Requirements-driven Testing, under the Verification business function. The goal of the Requirements-driven Testing practice is to ensure that the implemented security controls operate as expected and satisfy the project's stated security requirements. The Requirements-driven Testing practice is divided into two streams: Control Verification and Misuse/Abuse Testing. Control Verification verifies that the application's security controls satisfy the stated security requirements and validates their correct functioning; these requirements are typically functional in nature. Negative testing addresses the quality of the implementation of the security controls and aims to detect unexpected design flaws and implementation bugs through misuse and abuse testing. These streams align and link the activities in the practice across the different maturity levels; each stream has an objective, reached at increasing levels of maturity, by incrementally building a set of security tests and regression cases and executing them regularly through automation. The key word here is automation.
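As a concrete illustration of the kind of automated security regression case the Misuse/Abuse Testing stream accumulates, here is a hypothetical pytest-style example. The function `sanitize_filename` and its contract are invented for illustration; they are not part of SAMM:

```python
def sanitize_filename(name: str) -> str:
    """Hypothetical security control under test: neutralize path-traversal input."""
    # Strip path separators and parent-directory references from user input.
    for token in ("/", "\\", ".."):
        name = name.replace(token, "")
    return name

# Regression case for a previously fixed path-traversal bug; kept in the
# suite and executed on every build so the fix can never silently regress.
def test_parent_directory_references_are_stripped():
    assert sanitize_filename("../../etc/passwd") == "etcpasswd"

# Positive case: the control must not break legitimate inputs.
def test_plain_filenames_pass_through():
    assert sanitize_filename("report.pdf") == "report.pdf"
```

Running cases like these in the pipeline on every commit is what turns one-off findings into a growing, automated regression suite.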
In its most advanced forms, the practice promotes security stress testing, such as denial of service, and strives to continuously improve application security by consistently automating security unit test cases and automating security regression test cases for all bugs identified and fixed. Let's take a look at Secure Build, under Implementation. This practice focuses on creating a consistently repeatable secure build process and on accounting for the security of application dependencies. The Secure Build practice emphasizes the importance of building software in a standardized, repeatable manner, and of doing so using secure components, including third-party software dependencies. The first stream focuses on removing any subjectivity from the build process by striving for full automation. An automated build pipeline can include additional automated security checks, such as static and dynamic analysis, to gain further assurance and flag security regressions early, for example by failing the build. Now, we are not recommending that everybody start failing the build from the get-go; in fact, that's reserved for more mature organizations who have weeded out the false positives and have high confidence in their security checks, such as static and dynamic analysis. Only then do we recommend setting thresholds and criteria to fail the build; at the beginning, you should not start failing the build for security issues. The second stream acknowledges the prevalence of software dependencies in modern applications; it aims to identify them and track their security status
in order to contain the impact of their insecurity on an otherwise rather secure application. In its most advanced form, it applies security checks to software dependencies similar to those applied to the application itself. Now let's look at the second security practice, Secure Deployment, under Implementation. This practice focuses on automatically securing deployments to the production environment, along with all required secrets. One of the final stages in delivering secure software is ensuring that the security and integrity of the developed applications are not compromised during deployment. To this end, the practice's first stream focuses on removing manual error by automating the deployment process as much as possible and making its success contingent on the outcomes of integrated security verification checks. It also fosters separation of duties by making adequately trained non-developers responsible for deployment. The second stream goes beyond the mechanics of deployment and focuses on protecting the privacy and integrity of sensitive data, such as passwords, tokens, and other secrets required for applications to operate in production environments. In its simplest form, production secrets are moved out of repositories and config files into adequately managed digital vaults. In its most advanced forms, secrets are dynamically generated at deployment time, and routine processes detect and mitigate the presence of any unprotected secrets in the environment. My goal is to arm all of you with this knowledge so that you can apply OWASP SAMM 2.0 starting, literally, Monday. So let's look at how to apply this model. A typical approach to rolling out the SAMM 2.0
model includes implementing six phases in a continuous fashion, creating a life-cycle approach: with each iteration of the cycle, you increase your level of maturity for security assurance in several respects. Before we do that, we need to spend some time on preparation. This phase is the most critical for the success of an OWASP SAMM rollout in your organization; I've witnessed several organizations skip this phase, which derailed their efforts to roll out SAMM, so this is the most important phase in my mind. It consists of the following activities. First, defining the scope: you need to identify whether you would like to roll out SAMM across the whole organization, across a particular business unit, or at an individual application or project level. Once you define the scope, you need to identify the key stakeholders and get their buy-in, and once you have their buy-in, you should start spreading the word and evangelizing SAMM activities. Communication is key here: if you really want this implementation to be successful, identify who your key stakeholders are and get their buy-in; without that, it is very difficult to roll out SAMM. Next, measure the current level of maturity: after you get the buy-in, the first thing you do is start measuring the current level of maturity across your software development life cycle. In order to measure the current level of maturity, you need to start with an
assessment, conducting interviews with key stakeholders to evaluate current security practices. We recommend in-person interviews over sending questions by email or Slack; this way you can explain the intent behind an activity and clarify any doubts they might have. There are three ways you can perform an assessment: a lightweight assessment, a detailed assessment, and a hybrid assessment. The lightweight assessment is simply interviewing key stakeholders and recording their responses without validating what they say; that's why it's very efficient, but the accuracy of a lightweight assessment is not great. In a detailed assessment, the most accurate kind, you ask for evidence of the performance and quality of each activity being performed, so it costs a bit more. In a hybrid assessment, you ask for evidence on a need-to-know basis, for some of the activities but not all of them. I personally use hybrid assessments; in my experience, if you are performing a SAMM assessment internally, a hybrid assessment is the most effective approach. However, if you are involving an external AppSec consulting firm to perform the assessment, a detailed assessment is more appropriate. Some organizations perform a very detailed assessment during the first iteration and a lightweight or hybrid assessment during subsequent SAMM assessments. Once you record the responses, you can assign maturity levels using the SAMM worksheet we have provided. Now let's look at how to calculate the maturity score.
Finally, we came up with a new scoring model which is still primarily based on coverage; however, we added a quality criterion for each question to add another dimension to the score. Our guidance is to score zero if the quality criteria are not met. Going back to one of the SAMM core principles we discussed earlier, simplicity, we decided to add quality criteria per question, so the time to complete an assessment did not significantly increase with SAMM 2.0. The overall maturity score for a security practice is calculated by taking the average of maturity level one across stream A and stream B, and adding that to the same averages for level two and level three. Now, once you finish an assessment, you need to define the target according to your organization's business drivers and risk profile; as I mentioned earlier, it is very important to spend time understanding those during this exercise. Once you define the target, the most important step is to estimate the cost of implementation. SAMM initiatives fail when folks forget to estimate and plan for the increased cost of implementation, resulting in a lack of resources dedicated to the security improvements they planned.
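Stepping back to the scoring model for a moment, the calculation just described, averaging each maturity level's answer scores across streams A and B and then summing the three levels, can be sketched as follows. The helper name and example numbers are illustrative, and quality criteria are assumed to have already gated unmet answers to zero:

```python
def practice_score(stream_a: list, stream_b: list) -> float:
    """Overall maturity score (0-3) for one security practice.

    stream_a and stream_b each hold three per-level answer scores in [0, 1].
    Per level, the two streams are averaged; the three averages are summed.
    """
    assert len(stream_a) == len(stream_b) == 3
    return sum((a + b) / 2 for a, b in zip(stream_a, stream_b))

# Example: stream A fully covered at level 1 and halfway at level 2,
# stream B covered at level 1 only -> 1.0 + 0.25 + 0.0 = 1.25
score = practice_score([1.0, 0.5, 0.0], [1.0, 0.0, 0.0])
```

Because each level contributes at most 1.0, a practice tops out at 3.0 when both streams fully satisfy all three maturity levels.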
The cost of implementation then becomes a direct input into plan definition. During this phase, you determine the change schedule according to upcoming releases and develop and update your roadmap. We recommend implementing SAMM changes over a minimum of three phases and a maximum of five phases; each phase can span from three or four months up to twelve months. In order to focus on the highest impact, you should start with high-impact security improvements at the beginning, such as training and awareness, and threat assessment. Once the plan is defined, start implementation: implement the activities using the SAMM 2.0 guidance, and leverage other OWASP projects as well. SAMM aspires to be an umbrella project for all OWASP projects; that means each OWASP project can map back to one of the SAMM business functions and security practices. Here are some example OWASP projects and how they map to various SAMM business functions. OWASP Juice Shop and Security Shepherd map to the Governance business function, helping with the training and awareness aspects. There are several resources available for the Design and Implementation business functions, such as the OWASP Security Knowledge Framework, Secure Coding Dojo, and others. There are also several resources for the Verification business function; one of the most popular is the Security Testing Guide, and it has one for the web application as
well as for the mobile application. In terms of tooling, OWASP ZAP is the most popular security testing tool. So you can leverage all these various OWASP projects to help with the implementation. Last but not least, the ModSecurity Core Rule Set is a project that is really useful for the Operations business function, and projects like the OWASP Top 10 and OWASP Mobile Top 10 can map back to multiple SAMM business functions and practices. After the implementation, we need to create and update scorecards at regular intervals, capturing scores from before and after an iteration of the assurance program build-out, and communicate progress to management. I cannot stress communication to senior management enough here. I've used these scorecards to demonstrate and communicate security improvements to the highest levels of management at the companies I've worked for; it helps management visualize the progress in overall security improvements, which results in even more support for rolling out the next phase of security improvements. That is why this is the most crucial phase of a SAMM implementation.
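A scorecard of this kind boils down to per-practice before/after deltas between two assessment iterations. Here is a minimal sketch; the practice names and scores are made-up examples:

```python
def scorecard(before: dict, after: dict) -> list:
    """Render per-practice deltas between two assessment iterations.

    before and after map practice names to maturity scores in the 0-3 range.
    """
    lines = []
    for practice in sorted(before):
        delta = after[practice] - before[practice]
        lines.append(
            f"{practice:<22} {before[practice]:.2f} -> {after[practice]:.2f} ({delta:+.2f})"
        )
    return lines

rows = scorecard(
    {"Secure Build": 0.50, "Threat Assessment": 1.00},
    {"Secure Build": 1.25, "Threat Assessment": 1.50},
)
```

Produced every iteration, a table like this makes the improvement trend visible to management at a glance.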
Here are some of the other resources available to get you started with SAMM; all of these resources are linked from the owaspsamm.org website. If you don't remember anything else, just remember that one website. I highly recommend reviewing the SAMM Quick Start guide before you start adopting SAMM. The SAMM toolkit now comes in both Microsoft spreadsheet and Google Sheets versions, which helps both Microsoft shops and Google shops perform SAMM assessments; earlier we only had the toolkit in Microsoft spreadsheet format. So you can use either version for conducting assessments and building roadmaps and plans. I'd also like to point out that Concord USA, one of our sponsors, has contributed an online SAMM calculator, and one of the community members, Satish, has also created an online SAMM 2.0 dashboard; together they provide an online equivalent of the SAMM toolkit. For those of you who are already using SAMM 1.5, we have provided a mapping guide from SAMM 1.5 to SAMM 2.0. If you have any feedback on these resources, please use the SAMM assessment feature request form. Now, one more thing before we conclude this talk: we recently introduced the SAMM Benchmark initiative, which helps answer the question, how do I compare to other organizations? The SAMM Benchmark initiative is inspired by BSIMM; the goal of this project is to collect the most comprehensive data set related
to the organizational maturity of application and software security programs, from both self-assessing organizations and AppSec consultancies. There are both anonymized and verified data collection options, depending on your comfort level: we totally understand that several organizations value privacy, and we respect the privacy of those organizations; that is why we created an option to submit your assessment results and data anonymously. As someone said, the proof of the pudding is in the eating, so if you haven't yet, I really invite you to start using OWASP SAMM 2.0 next Monday. Hopefully I'll get a chance to come to BSides Toronto in person next year to discuss the lessons learned. If you have any questions, contact me at my email or hit me up on LinkedIn; connect with me and I can help with any questions you might have. I'd like to thank the BSides Toronto organizers, volunteers, and sponsors for putting together this great BSides conference in Toronto; I'm looking forward to visiting in person next year. And for those of you who might not know me, I'm from Toronto, I still have my house there, so it's always a great feeling coming back home. Thank you very much to all of you for listening to this talk, and if you have any questions I'll be available on Discord. Awesome, thank you very much. Uh, yeah, there are some questions that have come
in, uh, we're just gonna have to wrap up unfortunately here. Um, but from my own personal experience, I've gotta say that this is where, you know, AppSec hits real life and gets real results. So thank you very much for the talk, I appreciate it, and we're gonna wrap up for now. Again, thanks for coming and giving the talk.