
Signed Twice, Broken Never: The Rise of Hybrid PKI

BSides Seattle · 2026 · 29:28 · Published 2026-03
About this talk
BSides Seattle, February 27, 2026 · Lecture · Presenter: Ganesh Mallaya
Transcript [en]

How many of you were in this room for the previous session, where we were talking about honeypots? Perfect. That sets the stage for what I'm going to talk about for the next 20 minutes or so: quantum-era PKI and hybridization. We all saw how easy it was on the DEF CON floor to set up a honeypot and gather some data, and that's exactly what you'll see here. This was supposed to be my slide number four, but I pulled it forward to set the stage and the tone with the honeypot topic.

That gave a good example of the two most widely used threat vectors that adversaries are running today at internet scale, in a honeypot-like scenario. The first is harvest now, decrypt later, which many of you will have heard of in cybersecurity. The second, which is still evolving, is trust now, forge later, and it is much more about certificates. How many of you deal, or have dealt, with certificates in day-to-day life? There you go. Perfect. So what is trust now, forge later? Everyone understands harvesting: data harvested today

can be used by adversaries down the line once quantum computers are a reality, and IBM says there should be some on the floor by 2028. Harvest now, decrypt later sits on the confidentiality leg of your CIA triad. Now take one step further, to the integrity and authenticity of the data: that's when your adversaries go after your trusted root authorities, the certificates and keys that are out there exchanging traffic over HTTPS. The claim is that if your signing keys fall into the hands of the wrong adversaries,

they can forge your identities and your systems to pull the data out. And that's what we'll talk about: with these two attack vectors in our heads, how can we modernize our PKI infrastructure, both the web PKI and the enterprise PKI, which is essentially your private set of PKIs. I am Ganesh Mallaya. I work for a company named Appux. In my day-to-day job I work a lot with the CA/Browser Forum, which sets the mandates for certificate authorities and browsers, and at the same time I work

with the IETF to help modernize the upcoming PQC signature standards, which you'll see we are working on, and obviously a lot more around remote attestation. Now, why is this real right now? We are talking about post-quantum, and you'll hear more about hybridization of PKI: how the classical algorithms and the NIST-approved post-quantum algorithms, FIPS 203, 204, and 205, can be merged together to make a transition into full PQC. So why is this real right now? The reality is a survey done by ABI

Research, in conjunction with a lot of companies like ours and the various standards bodies out there. As you can see, the CNSA 2.0 advisory essentially states which systems and elements within your enterprise need to be transitioned to PQC, and to PQC-ready phases, by which years. Various standards bodies, the European Union, the Indian government, NIST, CISA in the US, are all coming out with their own mandates on what these changes are and how they have to be implemented, and it was spelled out especially well in the recent publications by NIST and

CISA on the adoption of hybrid PKI, the hybridization of the whole cryptography stack. And that brings up one of the key points in this graph: the blue curve, which is the adoption trajectory of hybridization in cryptography. For the foreseeable future, until maybe 2034 or 2035, there is going to be a significant upward trajectory in the consumption of hybridization, whether it is hybrid key exchanges, which are already in play on the internet, or hybrid TLS handshakes, where a certificate carries an amalgamation of both classical algorithms and PQ-safe algorithms.

And all of this is forced by these mandates. This is, in a nutshell, the schedule for when a lot of currently used algorithms, hashes, and signatures are going to be deprecated. A couple of them, RSA and elliptic curve, which are widely used today, are going to be deprecated by 2030 and disallowed by 2035. Now, we may all say quantum is right around the corner. How many of you open LinkedIn and find your feed flooded with quantum talk? I opened LinkedIn and there was a guy called Brian C., who works

for STIG, and he had a post very similar to what I'm going to talk about now; his posts are flooded with people commenting, hey, this is coming in PQC tomorrow, this is happening in quantum tomorrow. Quantum and quantum cryptography have just flooded the internet. So how accelerated are these adoptions and pilot phases right now? This was a survey, again by ABI Research, that lays out five different pillars for where PKI fits into the mix. So today, at the scale of the internet, and this is data that Cloudflare openly claims as

well, look at the first two points. A lot of your CDN providers today, be it Akamai, be it Cloudflare, even CloudFront (I was talking to a guy from Amazon today), are already PQ-protected: approximately 40 percent of them have either implemented PQ practices, number one, or, number two, already let the internet consume hybrid key exchanges. As per Cloudflare, as of January 1st, 2026, approximately 60 percent or more of internet traffic is running on TLS 1.3 with a hybrid key exchange, which is essentially ML-KEM combined with X25519.

Now what does that depict? We were all asking whether PQC would reach the implementation phase or not, and in 2026 we are seeing far more advanced examples of PQC and hybrid cryptography being implemented at internet scale. At the same time, there is the vendor ecosystem. The biggest part of a PQ migration is not just changing an algorithm; it is also seeing whether my backends and my infrastructure can support it, because the PQ algorithms are inherently so large that the compute, the performance, and the hardware all have to come together to support them. So toward the tail end of last year and the

beginning of this year, we saw a lot more vendors announcing support for PQ and hybrid PQ algorithms. And in the ABI survey, which covered around a thousand Fortune 100 to Fortune 500 organizations, 63 percent of CISOs said that PKI innovation, modernizing their PKI, is one of their new board-level topics: establishing their root of trust as a main element of the PQ journey. Now, why is PKI the board-level topic? That goes back to the five topics going around the market right now. Certificates have been chaos for everyone. Whoever deals with them knows: how many certificates you have, renewing them, revoking them, deploying them to endpoints, sometimes in CI/CD pipelines. It's chaos. We have all seen the drill where one certificate expires and, oh my god, the entire company runs around figuring out why. And at the same time, around April last year, the CA/Browser Forum came down hard on the validity of these certificates: maximum validity drops from 398 days down to 47 days by 2029, with the first step landing on March 15

this year, down to 200 days. Which means I cannot keep relying on my Excel sheets; I need a repeatable process to ensure these certificates can be renewed and deployed efficiently. That is essentially the fifth point: crypto agility. And at the same time, I have a really good friend who used to work for SailPoint, and he said one thing: the IAM space, identity and access management, specifically around identities and certificates, has evolved more in the last two and a half years than it did in the decade before. Today, pretty much everything we touch in the identity space has something to do with an SSL certificate.
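To put those shrinking validity periods in perspective, here is a rough back-of-the-envelope sketch; the fleet size is hypothetical, and the dates follow the CA/Browser Forum schedule as discussed.

```python
import math

# CA/Browser Forum maximum TLS certificate validity schedule
# (per ballot SC-081, as discussed in the talk):
validity_schedule = {
    "today": 398,
    "2026-03-15": 200,
    "2027-03-15": 100,
    "2029-03-15": 47,
}

fleet_size = 5000  # hypothetical number of certificates in an enterprise

for start, max_days in validity_schedule.items():
    # Minimum renewals per certificate per year at that validity cap
    renewals_per_cert = math.ceil(365 / max_days)
    total = renewals_per_cert * fleet_size
    print(f"{start}: max {max_days} days -> "
          f"{renewals_per_cert} renewals/cert/year, {total} renewal events/year")
```

At 47-day validity, a fleet that used to renew once a year is renewing roughly eight times a year, which is exactly why a spreadsheet stops being a viable process.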

Be it user authentication, mTLS, site-to-site VPNs, Zscaler: all these vendors have something to do with certificates being deployed, which means much more workload on your head to ensure that, when PQC hits the floor, the root of trust is secured and there is a repeatable process to renew it, while all these government agencies and regulators are mandating changes for their respective countries at the same time. And all of this comes down to what is well written in the NIST documentation:

the crypto-agility path. And again, it's not an end state; it's milestone number one. Crypto agility as an end state is kind of a myth. Everyone thinks that if I have everything automated, I'm on a crypto-agile path. It is not: it's a milestone everyone passes through. So what is really the path for enterprise PKI, and for web PKI, your public trust, that we can adopt today? If you follow the IETF working groups, which define a lot of the standards around signatures, one of the most discussed and most

controversial topics of the last year and a half is hybrid, or composite, certificates. Now, what do we know about composite certificates? Today you have a certificate signed by elliptic curve or RSA: a single-signature certificate that goes onto the internet at barely 2,000 bytes. The internet takes it like a piece of cake and completes the handshake. Now come to the PQ certificate at the top there, ML-DSA-87, the biggest of the lot: the sizes start around 16,000 bytes. That's huge, and then you see the performance dip; TCP starts dropping

packets beyond about 11 KB. So what happens with a composite certificate? This is an IETF draft; toward the bottom you can see the link to it. And composite is not a new topic: if you are a cryptographer, it has existed for ages, but it has evolved over the last three years or so to make much more sense for PQC. The primary reason for the composite conversation is backward compatibility. When you switch over to PQC, your current TCP stack cannot take the load, number one. And number two, with the majority of

our traffic still running on TLS 1.2: TLS 1.2 can't take PQC either. So I would need to switch over to TLS 1.3 with PQC, which half of the internet, or I should say 90 percent of it, is still not ready for. Now, how do I protect my data and at the same time keep backward compatibility so all my applications still run smoothly? That's where the composite certificate was born. The real advantage is the signature itself. Today, if you open up a certificate, you will typically see RSA-2048 or, say, RSA-4096

and so on. Today, with composite signatures, we are really talking about both the PQC and the RSA or elliptic-curve signatures sitting together as a single signature within a certificate, without breaking how your TLS handshake and TCP understand the signature today. Cloudflare and Google were generous enough to give us lab time to test the theory behind what we were doing with the IETF, and it tested well. And I don't know how many of you followed Google yesterday: while everyone was here talking about AI, they dropped a huge bomb

on how they are modernizing the web PKI with something called a Merkle tree certificate, which builds on these same signatures. These signatures give you three benefits to begin with. Number one, they secure your data: even if RSA is broken tomorrow morning, your certificates and your data are still signed with an additional PQ-safe signature. Mathematically, yes; when quantum computers hit, we don't know whether FIPS 204 and 205 will still be viable, and NIST is going through a second round of signature evaluations at the same time. And the second thing is you

don't need to lift and shift and rebuild every application from the ground up to support a PQC algorithm, because composite signatures give you huge backward compatibility. Even an Apache application with a login page running today can adopt these signatures and keep running on the web without you having to write additional code for backward compatibility. Yes, it has to be on TLS 1.2 for now, and when you go to full PQC it needs to be TLS 1.3, which means an upgrade, so it gives you a runway for making those changes. Obviously, a lot of

those changes do hit the market by 2030-2035, so you get enough runway to do it right. Composite signatures are essentially your migration bridge; they are not your end state. Everyone who worked on this draft, along with NIST and the other bodies, understood the downside. It is a migration bridge. And this slide gives you a visual comparison: we drew it on a whiteboard and had an AI render it, so some of the data here is not perfectly

accurate. But what you see very clearly is the classical handshake you have today: the TLS handshake, the way it is built, understands only one signature. You send it RSA or elliptic curve, it understands it, does its client hello and so on, and transacts the data. Now, there is one more element in TCP we tend to forget: the key size, or rather the signature size. Today, with elliptic curve at the biggest, you will see a maximum

of about 4 KB from an SSL certificate. But with ML-DSA, the third column, it starts at 13 KB and change, which I'll show you next, and your page takes forever to load. This was tested in a lot of labs with a very basic login page, and it takes a long time to load in a browser. So without significant changes, one, to your application and, two, to the TCP side, it was very difficult to load a pure PQ certificate. But when we brought in the composite, the

middle tier, yes, there was a certain level of change needed at the application layer so we could put in the composite signature, because by default not all systems understand these signatures. But without significant changes, the TLS handshake was smooth enough. Yes, the key sizes were still large, about 9x larger than your baseline RSA-2048, but they still allowed us to do the handshake and move data from one server to another without issues, and at the same time you're getting both an RSA signature and a quantum-resistant one, so a man in the middle cannot simply step in between and pull the data out.
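A toy sketch of this signed-twice logic: the HMAC-based signers below are stand-ins (Python's standard library has neither RSA nor ML-DSA), and the real construction is specified in the IETF LAMPS composite-signature drafts, but the AND-composition of two signatures is the essential idea.

```python
import hashlib
import hmac

# Stand-in signers: HMAC tags play the role of real RSA and ML-DSA
# signatures here (hypothetical keys). The point is the AND-composition,
# not the primitives.
RSA_KEY = b"classical-signing-key"
MLDSA_KEY = b"post-quantum-signing-key"

def composite_sign(message: bytes) -> bytes:
    # Composite signature: the classical and PQ signatures are
    # concatenated into one value carried in a single certificate field.
    sig_classical = hmac.new(RSA_KEY, message, hashlib.sha256).digest()    # 32 bytes
    sig_pq = hmac.new(MLDSA_KEY, message, hashlib.sha512).digest()         # 64 bytes
    return sig_classical + sig_pq

def composite_verify(message: bytes, signature: bytes) -> bool:
    # Both component signatures must verify: breaking one algorithm
    # (say RSA, tomorrow morning) is not enough to forge the composite.
    sig_classical, sig_pq = signature[:32], signature[32:]
    ok_classical = hmac.compare_digest(
        sig_classical, hmac.new(RSA_KEY, message, hashlib.sha256).digest())
    ok_pq = hmac.compare_digest(
        sig_pq, hmac.new(MLDSA_KEY, message, hashlib.sha512).digest())
    return ok_classical and ok_pq

msg = b"tbsCertificate bytes"
sig = composite_sign(msg)
print(composite_verify(msg, sig))        # True
print(composite_verify(b"forged", sig))  # False
```

Because the two signatures travel as one concatenated value, existing certificate plumbing sees a single signature field, which is what preserves the backward compatibility discussed above.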

And this is the comparison we were talking about. The classical baseline, an RSA-2048 chain, is about 1,664 bytes; that's the baseline data that goes out when you create a certificate. At the bottom is the full PQC chain, 13x larger than your current chain. And the tipping point for TCP is around 11 KB: beyond 11 KB on your TCP segments, performance drops and packets start dropping with it. The composites, which the world also knows

as hybrid certificates, by the way, are 8x to 9x larger depending on the algorithm you select. In this case we took ML-DSA-65, while in the previous example it was ML-DSA-87, so there's a minor difference in sizes between the two. Compared to RSA it is still 9x larger, but it still cuts the performance penalty by around 30 percent relative to full PQC. That's what we measured in the Cloudflare and Google labs, and there are a few more companies now trying to move this from pilot

stages toward production scale. In the breakage zone, looking at the TLS handshake for both the compactness and performance cliffs, the drop starts at about the 11 KB range today, and the composite landed somewhere around 10 KB: even with the top-tier ML-DSA-87 signature, the whole thing stayed under 11 KB, let me put it that way. In TCP segments, the composite was still about eight segments versus 11 to 11.5 for full PQC,

which matters because TCP starts breaking down past that point. There were also meaningful gains in bandwidth consumption, in both directions, which is especially important for the telecom sector. So overall, the whole transaction was about 30 percent smaller when compared to the full PQ chain. Except for the first block, none of this is secret; it's fairly well known who is doing what.
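The segment arithmetic above can be sanity-checked with a quick sketch; the chain sizes are rough assumptions based on the figures in the talk, and the MSS is a typical Ethernet value.

```python
import math

MSS = 1460  # typical TCP maximum segment size in bytes

# Approximate certificate-chain sizes (assumed, order-of-magnitude figures):
chains = {
    "classical RSA-2048 chain": 1664,
    "composite (hybrid) chain": 10_000,   # roughly 8-9x the baseline
    "full PQ (ML-DSA-87) chain": 16_000,  # roughly 13x the baseline
}

for name, size in chains.items():
    # Number of TCP segments needed to carry the chain
    segments = math.ceil(size / MSS)
    print(f"{name}: ~{size} bytes -> {segments} TCP segments")
```

The classical chain fits in a couple of segments, the full PQ chain needs roughly an order of magnitude more, and the composite sits in between, just under the cliff.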

If you just Google the collaboration between Cloudflare and Google on hybridization, you will see a ton of data about two things. One is composite certificates. The other is the Merkle tree certificate, which is hybridization adopted into the web PKI. The whole contention is that, the way PQC has evolved, the internet is not going to adopt it as-is, it cannot adopt it as-is; the browsers cannot do it as-is. So there is a more advanced way to modernize the whole PKI with something called a Merkle tree. The Merkle tree, again, is not a new concept; it dates back to Ralph Merkle's work in the late 1970s. They are adopting it into the world

of PKI to see how it can be implemented. It's an IETF draft, so please feel free to go and read what it does. And the same goes for the composite certificate: it is already running in GitHub repositories; there is code showing how composite signatures are built and used. At the same time, two of the big tech companies, one of which I have already named, Google, are piloting how composite certificates could be used in their PQR, that is, post-quantum readiness, programs for their own internal infrastructure. The same goes for two of the three big telecoms; one happens to be in Seattle.
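For intuition, here is a minimal sketch of the Merkle tree construction itself; it shows only the pairwise hashing, not the encoding or signing flow of the actual Merkle Tree Certificates draft.

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    # Hash each leaf, then repeatedly hash adjacent pairs until a single
    # root remains. A verifier then only needs log2(n) sibling hashes to
    # check that a given leaf is covered by the signed root.
    level = [sha256(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])  # duplicate the last node on odd levels
        level = [sha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

# Hypothetical batch of to-be-signed certificate payloads:
batch = [b"cert-A", b"cert-B", b"cert-C", b"cert-D"]
root = merkle_root(batch)
print(root.hex())
```

The appeal for web PKI is that one signature over the root can cover a whole batch of certificates, amortizing the large post-quantum signature across many leaves.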

And all of these are pilot partners who are advancing and collaborating with the IETF and NIST to test these frameworks and build out baselines. So yes, it is signed twice. It is not common for a certificate to be known to be signed twice with two different algorithms, but it is a step toward much more secure data and transactions, rather than doing nothing between now and whenever a cryptographically relevant quantum computer becomes a reality. And then there are four pillars to implementing this; this is an evolving document with NIST as well, but these are

the four pillars we are working through. With PQC, everyone has heard the refrain: auditing and inventory, build an inventory, build an inventory, build an inventory. The same goes for PKI. If you want to move into a hybridized model, the inventory is one of the key things: know which certificates you have, and start with your internal PKI to begin with. Number two, the time for waiting is gone; this particular slide was built last year for a different talk, and the time for pilots has already started. These signatures are already live and already available in various crypto libraries; I can name a few.
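The inventory pillar is easy to prototype. This sketch uses hypothetical records and field names, just to show the kind of repeatable query a certificate inventory should support instead of an Excel sheet.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class CertRecord:
    subject: str
    algorithm: str
    not_after: date

# Hypothetical rows an internal PKI scan might produce:
inventory = [
    CertRecord("login.corp.example", "RSA-2048", date(2026, 4, 1)),
    CertRecord("vpn.corp.example", "ECDSA-P256", date(2026, 9, 15)),
    CertRecord("api.corp.example", "ML-DSA-65+ECDSA (composite)", date(2027, 1, 10)),
]

def expiring_soon(records, today: date, window_days: int = 30):
    # First step of any migration: know what you have and when it expires.
    cutoff = today + timedelta(days=window_days)
    return [r for r in records if r.not_after <= cutoff]

def classical_only(records):
    # Certificates that would need re-issuance on the hybrid path.
    return [r for r in records if "ML-DSA" not in r.algorithm]

today = date(2026, 3, 15)
print([r.subject for r in expiring_soon(inventory, today)])  # ['login.corp.example']
print([r.subject for r in classical_only(inventory)])
```

With 47-day validity on the horizon, queries like these need to run continuously and feed an automated renewal pipeline, not a quarterly audit.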

They are already registered in the IANA registries that browsers refer to in order to understand the signatures and negotiate them. So these signatures are live for people to test, and various larger organizations are already testing and implementing them. But with these advancements in signatures and PQC comes one key point about today's PKI practices: today's CP/CPS documents, and people who deal with certificates and PKI know this really well, are outdated. They need a lot of evolution, especially with AI coming into the picture, so that my PKI is not on the brink of breaking on its own. So yes, CP/CPS standards are

evolving. There are various PKI companies offering both web-trust and internal-trust certificates, and they are working toward letting you deploy much simpler PKI, number one, and much stronger PKI practices, number two. Again, it's an active project within the IETF: please do follow the working group called LAMPS. Not a great name for an IETF working group, but it stands for Limited Additional Mechanisms for PKIX and SMIME, and it covers these certificate formats and more. And I call this a migration, but I also don't really like that word; it's not a migration, it's a transition from point A to point B, and point B for us is a PQ-ready state. But there

is a point A.5, which is where we are going first: the hybridization model, because you cannot go full PQC today. Which means it's really not a migration; it's a transition phase, and it has to carry a lot of accountability for backward compatibility. Today's applications are built to run on the classical algorithms; yes, that's the world we live in. But as we move toward a quantum-ready state, these applications, for today's success, need to stay backward compatible, which is where the hybrid model comes into the picture. And for the hybrid model to work, we have to ensure its backend infrastructure is

not toggled with too much, and again, these standardizations are essential. This is one of the posts I really liked yesterday: we are talking about PQC today, but the security field is evolving so much on a daily basis that we might not even know; there might be a post-post-quantum state too. We don't know. That's why backward compatibility and crypto agility, these two states, are very, very important. That's where this comes into the picture. We can build anything to sustain PQC, or classical, or whatever comes after. But if every migration or transition means rearchitecting and redeploying everything from

scratch, it's not sustainable, and with AI coming into the picture, God knows what will happen. So the crypto-agility state defines one of the key elements: define everything, from start to end, in such a way that it is repeatable and at the same time backward compatible, which means you don't have to rip and replace all the key elements for them to work. And that kind of sums up my talk. I'll open it up for any questions if you have them; I think we're right on time.
>> Yeah. Yes.

>> How is a double-signed key smaller than a traditional quantum one?
>> Yeah, it's a good question. How are they smaller than a traditional quantum key? Because of the concatenation process used in creating those signatures. You are not creating a separate signature for each: if you open up the registries and see how these signatures are built, it's one signature. The two components are concatenated together in such a way that, in a nutshell, they produce a smaller overall length.
>> All right. Yep.
>> Yeah. Um, great talk, by the way.
>> Thank you.
>> The only question I have is about the end, where you're saying that, between the backward compatibility and the transition phase, there might be another jump in between. What if the jump is beyond what we're designing for at that point? Is it back to square one again?
>> Yes. That's essentially why things have to be defined this way. Again, we don't know what is going to happen by 2035; that's when everything is supposed to be deprecated. We don't know what's going to happen between today and 2035. So that's the reason a lot of these states talk about having backward compatibility baked into the platform itself. Cool. Thank you. Yeah. Thank you.