
[Inaudible introductory remarks, then switching to English.] OK, so I'll talk about post-quantum crypto and quantum stuff. Just to warn you, these are very low-tech slides. You'll notice the handwritten stuff; I have very bad handwriting, but I tried to do my best. I used my six-year-old daughter's drawing application on her iPad and a stylus pencil to draw everything, so it might look a bit weird, but at least it's different. OK, so to start, I was looking for pictures of quantum computers, so I went to Google and searched for free-of-rights images of quantum computers.
But then I found this. Obviously these aren't quantum computers; they're the same kind of computers you already have. So I kept searching for quantum computers, and then I found this: the "quantum scalar pendant energy guard". Listen: it reduces inflammation, it promotes "unclumping", and it protects your DNA from damage, whatever that means. And of course it fights cancer cells, it increases energy, and so on and so on, and it improves blood circulation. You can find it on Alibaba; I don't know if they have any left. It's pretty expensive, but come on, it's pretty powerful. There's just one thing it doesn't do: check this list, it doesn't break any cryptography. So it's kind of useless. An actual quantum computer would maybe look like this big fish, whereas the cryptography we use today would be the small fish. So the purpose of this keynote talk, which is the first keynote I've given in my life, is to explain to you why this big guy would be able to eat the small guy, and how the small guy could defend itself against the big fish. I won't go into all the details of how quantum physics works and all that complex stuff; I'll just try to give you the very basics of the principles behind a quantum computer, and try to convince you that it's not that complicated; it's very basic mathematics.
So how does a computer work? With bits: a bit is either 1 or 0. Quantum computers work with quantum bits, which we call qubits, and we use some scary notation like this: α·|0⟩ + β·|1⟩. This is just notation; we use alpha and beta, but I could have used a and b, or x and y, whatever. The point is that a qubit is characterized by two numbers, alpha and beta, and they're not real numbers, they're complex numbers, of the form a + bi where i is the imaginary unit. These two numbers are called amplitudes, and amplitudes are a bit like quantum probabilities, if you will. What does that mean? Say this is a qubit; I've never looked at it, I prepared it in my lab, and it's characterized by two values, alpha and beta. Then I look at my qubit, and it's either 1 or 0: I will observe it equal to 0 with probability alpha squared, and I will observe it equal to 1 with probability beta squared (strictly speaking, the squared absolute values, since these are complex numbers). So obviously alpha squared plus beta squared must be equal to 1. And once I've observed it being 0 or 1, it stays 0 or 1 forever; it doesn't change anymore.
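This measurement rule is easy to simulate on a classical computer. Here's a minimal sketch in Python; the function and variable names are mine, not from the talk:

```python
import random

def measure(alpha, beta):
    """Collapse a qubit with amplitudes (alpha, beta) to a classical bit.

    P(0) = |alpha|^2 and P(1) = |beta|^2; amplitudes may be complex.
    """
    p0 = abs(alpha) ** 2
    assert abs(p0 + abs(beta) ** 2 - 1) < 1e-9, "amplitudes must be normalized"
    return 0 if random.random() < p0 else 1

# A qubit in equal superposition: alpha = beta = 1/sqrt(2).
amp = 2 ** -0.5
samples = [measure(amp, amp) for _ in range(100_000)]
print(sum(samples) / len(samples))  # roughly 0.5
```

After the first call, a real qubit would stay collapsed; this sketch only models the single observation.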
So the initial state is called a superposition, which is kind of 0 and 1 at the same time, but it's a bit more complicated than that. Like you've seen in this comic, they say, OK, superposition doesn't really mean 0 and 1 at the same time; it's something more complex. Mathematically you can see it as a combination of states, but the bottom line is that you cannot directly explain it using concepts that we're familiar with. It's quantum, it's magic, it's complicated. Now, how do you compute with this kind of thing? On a classical computer we have, for example, a logic gate like negation: a 0 comes in and a 1 comes out. Stupid: one number in, one number out. With a qubit, remember, a qubit is characterized by two numbers. So let's say we take the qubit |0⟩, which is 0 with probability one, so you know in advance that it will always be 0, and it's equal to 1 with probability 0, of course. Then you apply this quantum gate, and you end up with two different numbers, in this case 1/√2 and 1/√2. It means that if you observe it this time, it will be equal to 1 with probability (1/√2)², which is one half, and likewise it will be 0 with probability one half. OK.
But the big observation is that here you have two numbers to remember instead of just one. Now what happens if you have two objects, two bits? Classically, say two bits come in and, for example, one bit comes out. If you have two qubits, for example 0 and 1, then together they can take four values: 0 0, 0 1, 1 0, and 1 1, and each of these four values is characterized by a probability. In this case this value has probability 1 and the others 0. Then you can apply some quantum gate, for example the SWAP gate, which will just swap the values of the qubits, and you will end up with 1 0 instead of 0 1. But again, the point is that we have four numbers to remember when we deal with two qubits; you see where I'm going if we generalize this to a larger number of qubits. What's also important is that this quantum gate is actually of size four by four, because you can see any quantum gate as a matrix, like you saw in high-school math: a quantum gate is nothing more than a linear-algebra operation. If you were to simulate a quantum computer, you would just need matrix-vector multiplications; the result here is the state vector multiplied by the matrix representing the gate. So the upshot is that any quantum circuit can be simulated by linear-algebra operations, which are just matrix-vector multiplications. But the matrices become bigger and bigger as the number of qubits grows: if you have n qubits, the state I want to simulate is characterized by 2^n numbers, and the gate by 2^n times 2^n numbers. So it's relatively practical if you have maybe 10 or up to 20 qubits, but if you have 50 or a hundred qubits, then obviously 2^50 or 2^100 numbers become very large and unmanageable.
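A toy simulator along these lines really is just matrix-vector multiplication; here's a sketch in plain Python. One assumption: the single-qubit gate that produced the 1/√2 amplitudes above is written out here as the standard Hadamard gate.

```python
def apply_gate(gate, state):
    """Multiply a 2^n x 2^n gate matrix by a length-2^n amplitude vector."""
    return [sum(row[j] * state[j] for j in range(len(state))) for row in gate]

s = 2 ** -0.5
H = [[s, s],
     [s, -s]]                  # Hadamard: sends |0> to (1/sqrt2, 1/sqrt2)
print(apply_gate(H, [1, 0]))   # two amplitudes of about 0.707 each

SWAP = [[1, 0, 0, 0],          # two qubits: 4 amplitudes, a 4x4 gate
        [0, 0, 1, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 1]]
print(apply_gate(SWAP, [0, 1, 0, 0]))  # |01> becomes |10>: [0, 0, 1, 0]
```

With n qubits the state vector has 2^n entries and the gate 2^n by 2^n, which is exactly why this classical simulation becomes hopeless around 50 qubits.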
Whereas if you had a quantum computer, you wouldn't have to remember all these numbers: they would be implicit in the object itself. You would have only one stupid qubit here, only one object, but this object, because of quantum mechanics, would be kind of an encoding of many numbers. OK, so that was the theoretical part. Now, the reason I'm here today is that this kind of crazy computer could solve some problems much faster than any classical computer can, in the sense that in terms of theoretical complexity you would have a clear speed-up. For example, a problem that can be solved classically in at least 2^(n/2) operations might be solved with a quantum computer in polynomial or even linear time. Exponential usually means impossible, and polynomial or linear usually means practical in practice. This kind of speed-up is what we call an exponential speed-up: you go from exponential to polynomial, and an exponential speed-up in practice means going from the impossible to the possible. That's an interesting property: instead of just using approximation algorithms for problems that you cannot solve exactly, you might be able to solve those problems if you had a quantum computer. OK, now instead of explaining what a quantum computer is, let me briefly highlight three points about what a quantum computer is not.
First, it's not a super-fast computer. Many people come to me and say a quantum computer would be so many times faster. No, sorry: it could likely be slower in terms of latency, in terms of gate speed. Instead of running at, you know, three gigahertz, it might be much slower than that. Second, sometimes journalists or people think that it's something that computes all the answers in parallel simultaneously and picks the best one. It doesn't work like this: there is something called quantum parallelism, but it's a more complex notion; you cannot compute all the solutions in parallel and then pick the best, it's very different from that. And maybe last but not least, it's not something that would solve NP-hard problems. NP-hard problems are typically the kind of problems with exponential time complexity for which you can verify the correctness of a solution, but for which finding the solution is extremely hard. OK, in terms of complexity theory, I don't know if you've studied this or if you're familiar with notions like P and NP, but I can summarize it very simply like this. We have the class of NP-complete problems here, which are the hardest problems in NP, and NP-hard problems are at least as hard as NP-complete ones. These are hard whether you have a quantum computer or not. BPP, bounded-error probabilistic polynomial time, is the class of problems that you can solve efficiently on a classical computer like this one, and BQP is the equivalent of BPP but for quantum computers, so the class of problems that a quantum computer could solve efficiently. Now you see that this bubble is bigger than this one, which means there are some problems that can be solved efficiently by quantum computers but not by classical computers. Another important point about what a quantum computer is not: you may have heard about the company D-Wave.
They have very good marketing and a very nice website, and they do interesting work, by the way, but they don't do full-fledged quantum computers. They try to use quantum phenomena in dedicated circuits, something a bit similar to ASICs but using quantum effects, and they even call them quantum computers, but these aren't general-purpose quantum computers, and they will not be able to implement Shor's algorithm. Shor's algorithm is maybe the single reason why people care about quantum computers at all. It's the algorithm that could take you from n = p·q to the factors p and q. You probably know the implication of this: if you can factor big numbers, then you can break RSA. Shor's algorithm can also solve a problem that is a bit less straightforward, the discrete logarithm problem: here you know y = g^x mod p, you know g and p, but you don't know x, and you want to find x; if you find x, then you've solved the problem. And if you can solve this problem, you can break any Diffie-Hellman operation, including elliptic-curve Diffie-Hellman, which means you can break any SSH connection, any TLS connection, which obviously is not what you want. On a classical computer, solving these problems takes roughly exponential time, so it's practically impossible; with Shor's algorithm you would go from exponential complexity down to cubic complexity, which means in principle practical.
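To make the discrete-log problem concrete, here's what the classical brute-force attack looks like on toy numbers. The parameters below are mine and absurdly small; real groups have hundreds or thousands of bits, which is what makes this loop hopeless classically:

```python
def dlog_bruteforce(g, y, p):
    """Find x such that g**x % p == y by trying every exponent in turn."""
    acc = 1
    for x in range(p):
        if acc == y:
            return x
        acc = acc * g % p
    return None

p, g = 101, 2          # tiny prime and generator, for illustration only
x = 57                 # the secret exponent
y = pow(g, x, p)       # the public value: easy to compute in one direction
print(dlog_bruteforce(g, y, p))  # recovers 57, but only because p is tiny
```

Going forward (computing y from x) is one modular exponentiation; going backward is the hard direction, and Shor's algorithm is what would make it easy on a quantum machine.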
OK, so RSA is dead, Diffie-Hellman is dead, ECC is dead. Wait, what do we do now? Do we give up? Then people start panicking: we should switch to post-quantum crypto, and so on. Just one warning: it's not that bad. From the information we have today, some researchers have tried to estimate how costly it would be to actually break RSA or ECC, and they came to this kind of conclusion: to factor a 1k RSA modulus could take approximately one and a half million qubits. You're like, OK, wait, why would breaking something with a thousand bits take a million qubits? The main reason is that quantum computers are very unstable devices: there's a lot of error and interference coming from the environment and from within the qubits themselves, so you need many more qubits to correct the errors that happen inside your quantum computer. There are theoretical results telling you that you can always correct up to a certain amount of errors, but it means that you need many more qubits. And in terms of latency, in terms of time, they estimated, again based on the information available today, that to factor a 2k RSA modulus (the RSA you use today is at least 2048 bits) would take approximately five months. So that's the order of magnitude: it will not be like, you wait one second and you've factored; it will not be that easy.
OK, so how far are we from achieving this? Let me just point out that this is a logarithmic scale. Today the best quantum computers we have, or circuits that compute some stuff using quantum gates, have between ten and twenty qubits, depending on how you count. If you read the news, you have the guys at Google and IBM, who are a bit competitive with each other, telling you they will have 40- or 50-qubit systems by the end of the year; but then they tell you, well, maybe they won't fully work, but we'll try to have 50 qubits, partly just for the PR. Anyway, this is where we are today, and the best factorization result is that we know how to factor 15; spoiler alert, 15 equals 3 times 5. And this, on the order of a thousand bits, is the size of the RSA modulus or the Diffie-Hellman multiplicative group you would try to break, and this is the order of magnitude of the number of qubits we would need. So here we are, and that's the number of qubits we need. How many years will it take? The people who are trying to sell you post-quantum crypto will tell you five years, buy our stuff, because quantum computers are just around the corner. Then you have researchers who want funding to work on quantum stuff, and they will tell you maybe ten years, we need money to do the research. And then you go see the very same guys behind closed doors, and they will tell you it will never happen in my lifetime, you know; true story. So, yeah. Another important algorithm is Grover's, named after the guy who invented it; it's so-called quantum search. In this case you don't get an exponential speed-up, just a quadratic speed-up: you go from complexity O(N) to O(√N). So what's going on here?
You have a table of values that is not sorted, and you want to search for one specific value. On a classical machine there's no way to cheat: you need complexity proportional to the size of the array. On a quantum machine you would have theoretical complexity proportional to the square root of the size of the array. Again there are caveats, because there are constant factors hidden in the big-O notation, but the upshot is that if you have a symmetric key of n bits, its security is reduced to approximately n/2 bits. So if you use AES with 128 bits, you would conclude that you only get 64-bit security. But then, OK, you want to read the fine print and see what that really means in practice. People tried to do the exercise and estimate how slow it could be, how big such a quantum circuit would be, and they came up with this figure of a circuit of size 2^80 gates. You're like, really, what does that mean? So I tried to count: even if each quantum gate were as small as a hydrogen atom, which is some tens of picometers, this circuit would be the diameter of the solar system. And then you're like, all right, but there will be improvements that make it much smaller; maybe they will end up with something just the size of the moon, which is much smaller than that, but it would still be, you know, pretty tough; it will not fit in this room, obviously. So anyway, even if you're scared about quantum computers, the solution here is straightforward: you just double the key size and use AES-256, and then you still get 128-bit security. OK, now let's switch to the post-quantum part. How much time do I have? OK, there's time. So you may know that post-quantum means an algorithm that would not be broken by a quantum computer. Obviously it should not rely on factorization or discrete log, which means, theoretically speaking, that the problem of breaking the cipher should be outside of this class that we call BQP.
So RSA is not post-quantum, but AES is post-quantum, because we only get a quadratic speed-up; we're still in the exponential complexity domain. Now, some people will tell you, OK, you must switch to post-quantum because we'll have quantum computers in ten years, and other people will say, no, you should not care about it because it will never exist. These are just people's opinions. To be more rational and think about this in a non-binary way, my point is that I see it as insurance. You pay insurance against, you know, flooding or natural disasters, against a very unlikely event, and that's risk management. If you're a CISO, you have some budget to spend: how should you prioritize this in terms of investment, given the range of stuff you have to protect? I often say that this should not be a priority, but if you have money left, it's the kind of insurance you may want to buy, depending on what you're offered. It's not binary anyway. Now, why should we care about this from a pragmatic point of view? The main reason why you hear about quantum and post-quantum stuff in the news, especially these days, is these guys, the NSA.
In August 2015 they published a memo recommending that US federal agencies switch to post-quantum crypto in the near future. I don't remember exactly what terms they used; it was something like, developers and operators will need to transition to quantum-resistant algorithms in the future, whatever that means. But if you've worked in the public sector, in administration, in your own countries, then you know that making things change takes a lot of time; there's a lot of inertia and latency. In a country like the US, it might take 20 or 30 years to switch all the systems from pre-quantum to post-quantum, so that may be one of the reasons the NSA said this. And of course you have conspiracy theories, like, aha, they have a quantum computer in their basement and they want to protect themselves; there are many theories. But my personal opinion is that, from their point of view, it's also insurance: they know that the Chinese are also investing in quantum computing, and they want to make sure that in the long term they are protected. So what's happening concretely is that we have this project by NIST, the National Institute of Standards and Technology in the US, the guys who create the standards, and who in particular create the crypto standards: the guys who brought you AES and SHA-3.
Now if you remember, AES and SHA-3 were both developed as a kind of public project, a public competition, and honestly it's very smart of NIST, of the US in general, because they publish a kind of RFP saying, we want new ciphers. Then many cryptographers, many academics, many of them Europeans, create new ciphers, using public money, for example from the European Union, for research, for years and years, hiring postdocs and PhD students, organizing workshops and so on, and they send their ciphers to NIST for free. Then for two or three or four years, all these researchers again spend public money trying to break the ciphers, to understand them better, to implement them and optimize the implementations, and at the end NIST picks the one or the ones that they think best suit their requirements. I didn't say the best ones: the ones that they think are the best. And then we have a good standard; I think AES is a very good cipher, and SHA-3, Keccak, is very nice. The irony is that the Americans get it all for free, and then they sell products back to the Europeans based on the standard that the Europeans developed in the first place. But I'm digressing. The upshot is that this kind of project is very fun; I've been involved in NIST competitions before, and it's a kind of demolition derby: you publish your cipher, and then your incentive is to try to break the other guys' ciphers. You try to attack them in a variety of ways, you try to find weaknesses, you even try to create new artificial security notions and claim that their ciphers don't satisfy these notions. A lot of fun. Now, if you look at the timeline: the call for proposals was published last year, and the deadline for submissions is the end of November, so it's really soon. Now you're wondering, OK, what are these submissions? They have not been published yet.
They will be published in December, I guess, but there are a few that we already know. I put up a webpage listing them; it's totally unofficial, not approved by NIST, anyway, and there are a few algorithms that have been published early, two of which I co-designed with a master's student; I will talk about one of them later. Now, very briefly, there are five types of post-quantum crypto. The first is based on error-correcting codes, with ideas that date back to the late '70s. Hash-based crypto is from about the same period, late '70s, early '80s. Lattice-based is a bit more recent; it's harder to explain, the most mathematical one, if you will. Multivariate crypto is based on the problem of solving nonlinear equations, multivariate equations. And the last one is isogeny-based, built on a new kind of problem on elliptic curves, but one that is not the discrete log, something a bit different; it looks like it's post-quantum, but people are not totally sure, so maybe the first four are the main ones. In the little time I have left, I will try to convince you that hash-based crypto is cool, and maybe the most secure one; not the most efficient one, but the one for which we have the best assurance in terms of security. The interest of hash-based crypto is that you know it's at least as secure as the hash function behind it. Obviously you don't want to use MD5, because MD5 is broken, and you shouldn't be using it anyway. The magic here is that you create public-key crypto, so encryption schemes, signature schemes, and maybe Diffie-Hellman; well, with hash-based you only get signature schemes. But these will be secure as long as the hash function is secure, and we are very confident that the hash functions we use today, like SHA-3 or BLAKE2, are secure. So how does it work? The simplest construction is extremely stupid, but it's surprisingly difficult to understand well; at least it was for me the first time I saw it.
Here there's something called a one-time signature: you want to sign something in the sense of a digital signature. Your secret key will be two values, K0 and K1; think of these as 256-bit keys, for example. The public key is the hash of K0 and the hash of K1. So you know that this is the hash of K0, but you don't know K0, and you know that this one is the hash of K1, but again you don't know K1. I'm the owner of the secret key, and I want to sign a message: I want to demonstrate to you, who know the public key, that I intended to send this specific message. My message is extremely stupid: it's either 0 or 1, a single bit. If I want to sign the message 0, I publish K0; I send you K0 as my signature. And what do you do? You take K0, you hash it, and you verify that it's equal to the hash of K0 that you already have. This way I prove to you that I own K0, that I own the secret key that corresponds to this public key. An attacker couldn't switch the 0 to a 1 and forge a signature of 1, because the attacker does not know K1. This is stupid, but it's also almost completely useless, because you need as many key pairs as there are message bits, and any key can only be used once: once I've published K0 or K1, an attacker knows it too.
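This one-bit one-time scheme is essentially a Lamport signature; it fits in a few lines. In this sketch SHA-256 stands in for the hash function, and the helper names are mine:

```python
import hashlib, os

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

# Key generation: two random 32-byte secrets; the public key is their hashes.
k0, k1 = os.urandom(32), os.urandom(32)
pub = (H(k0), H(k1))

def sign(bit: int) -> bytes:
    """Signing reveals the preimage corresponding to the message bit."""
    return k0 if bit == 0 else k1

def verify(bit: int, sig: bytes, pub) -> bool:
    return H(sig) == pub[bit]

sig = sign(1)
print(verify(1, sig, pub))   # True
print(verify(0, sig, pub))   # False: the signature can't be reused for bit 0
```

Forging a signature for the other bit would require inverting the hash, which is exactly the assumption the whole construction rests on.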
OK, so what's the point? Now let's say you want to sign more than one bit: a number between 0 and some value W. Instead of hashing your secret key once, you hash it many times, W times: I wrote H^W(K). Again K is my secret key, and this guy, H^W(K), is my public key: I've been hashing K, hashing the hash of K, hashing the hash of the hash of K, and so on. Then the signature of a number x, where x is between 1 and W, is the hash of K repeated x times, H^x(K). And if you want to verify that this matches my public key, you hash this value again W minus x times until you get back to the public key. We're just hashing stuff. And again, here the problem is that you can only use each key once.
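This chain construction is the core of the Winternitz one-time signature. Here's a sketch; one caveat I've added as a comment: real Winternitz schemes also append a checksum, because as written, anyone holding H^x(K) can hash it further and "sign" any larger value.

```python
import hashlib, os

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def chain(x: bytes, n: int) -> bytes:
    """Apply H to x a total of n times: H(H(...H(x)...))."""
    for _ in range(n):
        x = H(x)
    return x

W = 16
sk = os.urandom(32)
pk = chain(sk, W)            # public key = H^W(sk)

def sign(x: int) -> bytes:
    """The signature of x (1 <= x <= W) is H^x(sk)."""
    return chain(sk, x)

def verify(x: int, sig: bytes, pk: bytes) -> bool:
    """Hash the signature W - x more times; it must land on the public key."""
    return chain(sig, W - x) == pk

# Caveat: H(sign(x)) is a valid "signature" of x + 1, which is why real
# Winternitz adds a checksum chain running in the opposite direction.
print(verify(5, sign(5), pk))   # True
```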
OK, it's normal if you don't understand everything at first; it's very counterintuitive. But the problem is, like I said, that you can only use a key once. So how do you solve this problem? Like in many computer-science problems, when you're stuck, what do you do? You use a tree, a binary tree. In this case we use what is called a Merkle tree. It sounds fancy, but it's nothing more than a hash tree, in the sense that this node here is the hash of this and this: you concatenate these two children and you hash them, and that gives the node. And again, this node is the hash of the concatenation of these two, and that's it. Your public key will be the root of the tree. Now you want to sign a message, and you say, oh, I'm going to use K1 here. So what do you do? You use K1, and along with your signature you publish this guy and this guy, which together are called the authentication path. Publishing these two values allows the verifier to hash this to get this one, then hash these two and get back to the public key, and verify that the public key that was computed is equal to the public key they knew in the first place. Here I have only four values.
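A four-leaf Merkle tree with an authentication path is only a few lines. In this sketch the leaves are placeholders for the one-time public keys; the structure, not the key generation, is the point:

```python
import hashlib

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

# Four leaves standing in for four one-time public keys.
leaves = [H(bytes([i])) for i in range(4)]

# Each inner node is the hash of the concatenation of its two children.
n01 = H(leaves[0] + leaves[1])
n23 = H(leaves[2] + leaves[3])
root = H(n01 + n23)              # the root is the actual public key

# To use leaf 1, publish its authentication path: the sibling leaf
# and the sibling inner node, enough to recompute the root.
auth_path = [leaves[0], n23]

def verify(leaf: bytes, path, root: bytes) -> bool:
    return H(H(path[0] + leaf) + path[1]) == root

print(verify(leaves[1], auth_path, root))   # True
```

This verifier is specialized to leaf 1's position; a general one also records, at each level, whether the sibling sits on the left or the right.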
But in practice, imagine you have many more values, a tree of depth maybe 40; with a depth of 40, you compute 40 hashes. Now something a bit different: I've talked about one-time signatures; now I'll talk about few-time signatures, which is a bit different. This is much more recent; the idea is from Reyzin and Reyzin in 2002. The secret key is a bunch of keys, K0, K1, up to Kn, and again the public key is the hash of each of these keys. Now I want to sign a message. I will use a hash function, but one a little different from the hash functions you're used to: given a message, my hash function gives me a bunch of numbers. Here, say, it gives me the two numbers 1 and n. So what do I do? I publish the two keys K1 and Kn, because the hash of the message is 1 and n; if it were 2 and 3, I would have published K2 and K3. Then I repeat this for other messages, and the idea is that as I keep signing messages, I reveal some keys, but there will still be many keys that have not been published.
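This few-time idea (the scheme is known as HORS) looks like this in code. The function mapping a message to key indices below is a toy stand-in I made up for illustration, not the real construction:

```python
import hashlib, os

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

N = 256
sk = [os.urandom(32) for _ in range(N)]   # a bunch of secret keys
pk = [H(k) for k in sk]                   # public key: the hash of each one

def indices(msg: bytes, count: int = 2):
    """Toy 'subset hash': map a message to a few key indices."""
    d = H(msg)
    return [d[i] % N for i in range(count)]

def sign(msg: bytes):
    """Reveal exactly the secret keys that the message selects."""
    return [(i, sk[i]) for i in indices(msg)]

def verify(msg: bytes, sig, pk) -> bool:
    return (indices(msg) == [i for i, _ in sig]
            and all(H(k) == pk[i] for i, k in sig))

sig = sign(b"hello")
print(verify(b"hello", sig, pk))          # True
```

Each signature leaks a couple of secret keys, so after enough signatures an attacker has seen keys covering many index combinations: that's why it's few-time rather than many-time.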
But obviously, if I keep signing messages, and I get random-looking numbers here, at some point I will have published all the keys, and it becomes insecure: if I publish all my secret keys, then my secret keys are not secret anymore, and everything becomes useless. So, these three types of constructions, taken individually, look totally stupid and useless. But if you combine them, if you combine the one-time signatures, the Winternitz many-time idea, and the Merkle tree, and you add even more trees, and you put trees inside trees inside trees, then you get SPHINCS. SPHINCS is a scheme created in 2015 by Dan Bernstein, whom you may know, and other people. The idea of this construction was also to eliminate the state, in the sense of being stateless. Stateless, as opposed to stateful, means that you don't need to keep track of a counter; the counter would be like, OK, I've used key number one, now I use key number two; I've used key number two, now I use key number three. It eliminates this kind of statefulness by randomizing the index of the key: each time you pick a random key, and because it's random, it's unlikely that you'll pick the same key twice.
That's the very general idea. So it looks nice, but there are several problems with SPHINCS, or maybe several limitations. The first one is that the signatures are pretty big: for the intended post-quantum security, the design choice gives about 40 kilobytes, and it's relatively slow. But the biggest limitation, from my point of view, is the complexity. You have several types of trees: you have the big hyper-tree here, and each node of it is itself a tree. At the very root you have a Merkle tree, and the leaves of this tree are another type of tree, one-time signature schemes of the Winternitz type that I mentioned before. Each node here signs the root of the tree one layer below, so you have these trees of trees and so on, and at the very end, the leaves of the hyper-tree are of the HORST type, the few-time signature scheme I mentioned before, and that is the guy that actually signs the message you intend to sign. When you publish a signature, you publish the signature computed with this guy, and then you also publish all the signatures that connect this node up to the root of the hyper-tree. That's a mess, and we tried to do something simpler.
But we ended up doing something that may actually be even more complex; it depends on how you look at it. That's what I did with a master's student of mine this summer, well, the first semester of this year. Our goal was to simplify, to improve things, to make it faster or more secure, and we called our construction Gravity-SPHINCS. Why gravity? Because we thought it was as ambitious as the problem of solving quantum gravity. So it's kind of SPHINCS with a bunch of new tricks, and the goal of these tricks is to make signatures shorter and to make the scheme faster, or less slow. I will quickly try to present these tricks to give you an idea of what we did. The first one is to use collision-free hashing. I mentioned that in this few-time stuff you pick a bunch of indices, a bunch of numbers. In SPHINCS, if you pick for example 32 numbers, each number is random, and if each number is random, it can happen that you get the same number twice, the same index twice. That means that instead of publishing two different secret keys, you would publish only one, and we realized that this makes the scheme a little less secure. So we came up with a modified version of the hash function, using a pseudorandom generator, or PRNG, where there is no such collision; the risk of collision is eliminated. Based on this, we analyzed the security of the scheme, we wrote new theorems, and we proved that we get this level of security with this kind of parameters. Another trick we used, to make it a little bit faster, is a trade-off between key size and the amount of computation: to simplify a lot, instead of having trees of trees of trees, we have a big tree here that we precompute, so that when you sign a message you only have to compute a small tree instead of computing a sequence of trees.
This makes the key a little bit bigger, but it saves you a lot of computation. Another trick we used was batching. Let's say you want to sign many messages at the same time; say you want to publish GNU/Linux packages and you want to sign everything. Instead of computing one signature for each package, you want to publish one single signature for the whole list of packages. To simplify again: we take all the messages, we put them in a tree (so again we solve the problem by using a tree), and then we just sign the root of this tree. Now if you want to publish the signature of one of these guys, instead of publishing all the messages, you publish the message and its authentication path, a bunch of nodes within the tree. So once more we put a tree into a signature scheme that already has trees of trees of trees. OK, so the internals are pretty complex; I don't want to bore you with all this, it already starts to be pretty boring. But the bottom line is that we end up with these kinds of parameters, for the versions that we intend to submit to NIST; maybe the final versions will be a little bit different. We have signature sizes of between roughly 20 and 35 kilobytes.
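The batching trick, one Merkle root over a whole list of packages, can be sketched like this; the signing of the root itself is left abstract, since any signature scheme (Gravity-SPHINCS or otherwise) would do:

```python
import hashlib

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

# A batch of messages, e.g. package files; hash them into Merkle leaves.
msgs = [b"pkg-a", b"pkg-b", b"pkg-c", b"pkg-d"]
leaves = [H(m) for m in msgs]
n01 = H(leaves[0] + leaves[1])
n23 = H(leaves[2] + leaves[3])
root = H(n01 + n23)   # sign only this value, once, with any signature scheme

# Shipping pkg-c alone: attach its authentication path, not the whole batch.
proof = [leaves[3], n01]

def verify(msg: bytes, proof, root: bytes) -> bool:
    # Recompute the path from the message up to the signed root.
    return H(proof[1] + H(H(msg) + proof[0])) == root

print(verify(b"pkg-c", proof, root))   # True
```

As with the earlier tree, this verifier is specialized to pkg-c's position in the tree; a general one carries left/right flags along the path.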
So it's still pretty big; you might argue that it's not realistic. Well, I have maybe, I don't know, half a terabyte of memory on this machine, but obviously if you're on very low-end devices, IoT stuff, with limited memory, you won't be able to store thousands and thousands of signatures. Still, I think it would fit many use cases in terms of signature size. The keys, on the other hand, can be pretty small: the public keys are 32 bytes only, and the secret keys are 64 bytes, plus the cache if you do the caching. OK, so as you can see, this post-quantum stuff is pretty complicated, and that's just one type of scheme, hash-based; the other ones, lattice-based, code-based, multivariate, are totally different from this and not much simpler. That's why this NIST project will last at least four or five years: we need to better understand these kinds of constructions, and we need to be sure that they are secure enough to become a standard. So like I said, the deadline is the end of November, and then in April 2018 NIST will organize a workshop where submitters will come on stage and present their submissions. Co-located with this event there will be PQCrypto 2018, which is the main conference on post-quantum crypto; it's been running since the mid-2000s, I don't remember exactly, and it's where the best resources on post-quantum crypto get published. A nice detail: the location is Miami, Florida, so I hope I will go there. So yeah, that's pretty much all I had to say today. Thank you for coming; I will happily take any questions. That's my contact, and a very small commercial plug for my book: it will be published in the coming days, available on Amazon on November 21st, so if you're interested in crypto you may consider it. Thank you.
[Applause]
[Host] So, everybody understood the hash-based goodness? And buy his book; I already did, I had early access. OK, is there a question? Yes, over there.
[Question] You spoke earlier about introducing a work factor to hashing. Isn't that what current password-hashing algorithms already do?
[Speaker] Yes, so the question is about work factors in hash functions. Here we explicitly do not want to introduce a work factor, because a work factor makes hashing slower and makes it use memory. That's what you want in password hashing, but here we want hashing to be as fast as possible, so we don't add extra work on purpose; we try to minimize it. So this cannot be used for hashing passwords.
[Host] Thanks. More questions? OK, thank you all. Thank you, JP.