
  • Research Collection

    Doctoral Thesis

    Resource theories of knowledge

    Author(s): Rio, Lídia Pacheco Cañamero Barbieri del

    Publication Date: 2015

    Permanent Link: https://doi.org/10.3929/ethz-a-010553983

    Rights / License: In Copyright - Non-Commercial Use Permitted

    This page was generated automatically upon download from the ETH Zurich Research Collection. For more information please consult the Terms of use.

    ETH Library


  • Diss. ETH No. 22801

    Resource theories of knowledge

    A thesis submitted to attain the degree of

    DOCTOR OF SCIENCES of ETH ZURICH

    (Dr. sc. ETH Zurich)

    presented by

    Lídia Pacheco Cañamero Barbieri del Rio

    MSc Physics, Universidade de Aveiro

    born on 26th February 1986
    citizen of Portugal

    accepted on the recommendation of

    Renato Renner, examiner
    Fernando Brandão, co-examiner
    Patrick Hayden, co-examiner

    2015

  • To dragons, pineapples and fools, who loved the sun on the window sill.

  • Acknowledgements

    Dedicate enough years to the study of a discipline and it ends up moulding your views of the world a little. In my case, information theory has become sieve, rule and compass. The search for abstraction and meaning might have always been there, but it was quantum information theory that gave me the tools to find and distil them, and what began as personal taste became an addiction. This thesis, for example, is a by-product of that addiction. I was half-heartedly wrapping up my PhD, stapling papers together as one does, and writing a review of recent progress in thermodynamics for the introduction, when, in reading through different approaches to resource theories, it occurred to me 'hm, I could generalize this, just to give a clearer overview of known results'. Eight months later, the slight generalization had grown into four chapters and I still didn't have a PhD—yet I had never been happier in research. As I stop to think 'how did I get here?', I realize that how I got here is a series of exceptional people who guided, inspired and supported me. I owe them much more than this acknowledgement, but this is the space I am given to thank them, so let us start from the beginning.

    In the beginning we will find my family, my first stroke of luck. It was in your arms that the world reached me, and when the world did not reach us, I watched you embrace it and weave it anew. Later, I was fortunate to have a handful of excellent teachers at school, and it is remarkable the impact that a good teacher can have. In particular, I am grateful to Eduarda Rondão, Luís Miranda, Jorge Cravo and Teresa Burnay, who taught me natural sciences and mathematics, and encouraged me to pursue extra-curricular science projects.

    If not for Eduarda Rondão, I would not have known of the Environment Olympiads, and I would never have met the genial Tomás Goucha, who has changed my life in uncountable ways. It was with him that science, maths and history became adventures. He also introduced me to Projecto Delfos, a pioneering project based in Coimbra, which teaches maths to talented young people from all over the country. To everyone involved in Delfos, thank you for all the strange and beautiful mathematics, for taking me in and letting me crash at the dorms and lectures when I should not really have been there, and for your friendship. We all know frogs.

    The first time that I was involved in a serious research project was with the parabolic people at FCUP in Porto. I am thankful to the Heavy Metal team (Diogo Rio Fernandes, Mariana Proença and Carlos Costa) for sharing many an evening writing proposals, tuning high-pitched speakers and aligning lasers, to Carla Carmelo Rosa and Hélder Crespo for their guidance and enthusiasm, and to Francisco Carpinteiro, mechanics sorcerer, who made it all work. I had no idea of the complexities and hardships of experimental physics, and this project doubled both my respect for the people who choose that path and my certainty that I was not one of them. It was an incredible experience—thank you for letting me be part of it.

    Then there was Bruno José Conchinha Montalto, who shattered my prejudice about applied maths and computer science in particular. He told me about Turing machines, and computability, and complexity classes and security proofs. It was my first peek into a whole new world, and he had the gift of making it all seem as casual and natural as eating sausages with nutella and chicken nuggets with ice cream.

    A little later we will find Ricardo Assis Guimarães Dias, my supervisor during my final undergrad years at Universidade de Aveiro, who shaped my academic future in two fundamental aspects. Firstly, he was an excellent supervisor, and if I am any good at it myself, it is because he showed me the importance and possibilities of the role; when in doubt, ask yourself What Would Ricardo Dias Do and you cannot get it too wrong. Secondly, he was tireless in helping me find a good field and supervisor for my PhD. Even though his own area of research is strongly correlated systems, we dug together until we found quantum information theory—and then it became obvious that I could not work on anything else. It was also he who first said 'this Renato Renner seems to be very good, you should try to meet him.'

    And so we get to Renato, my PhD supervisor and truly the heart of all this. Words can only lower bound his importance, and not tightly at that, but let me try. After Ricardo Dias' suggestion, and before we met in person, I started reading Renato's lecture notes. The first chapter was an introduction to classical information theory, and while I was reading it, Bruno retorted 'don't believe everything that physicists tell you: a probability measure is a positive function on a σ-algebra'. Yes, I thought, that is precisely what he writes here, after a soft prelude to the meaning of probability. That was the beauty of Renato's lecture notes: they were mathematically rigorous, but he would motivate definitions and results in a very intuitive and clear way. You can find the same beauty in Renato's research: he has a deep knowledge of the field and he is technically very strong, but it is his knack for finding and motivating the important questions that makes the whole difference. That talent is called vision, and it is the first thing that a PhD student can wish for in a supervisor. The second is people skills, and there again I could not have been luckier. Renato is a caring supervisor, keen on making his students happy and willing to learn from past mistakes—a rare quality. I have profited enormously from his supervision, insights, kindness, patience, encouragement, flexibility and friendship, even through the many unproductive phases of my PhD. I was often inspired by his diplomacy, drawn from both empathy and pragmatism. Above everything else, I cherish our long conversations about physical theories, epsilons and deltas, self-awareness, social dynamics, time, specifications, parenthood, correlations, the scientific process, leadership, semilattices, ontology, free will, running, thermodynamics, subjectivity, and how to hide textbooks in prison cells. Thank you for all this, and for teaching me that life is too short to work on problems that are not fundamental (not to be confused with 'you only live once').

    Now I find myself with no words left to express how much of the person I am today, and of the researcher I am starting to be, I owe to Roger Andrew Colbeck. So words I must borrow to say that he taught me how to see, notice every tree, understand the light, concentrate on now. Thank you.

    We move on. The original, unpublished work in this thesis was developed with Lea Philomena Krämer Gabriel, and it was she who kept me going when nothing seemed to work. From office to flatmate, from teacher to student, from co-actor to co-author, from friend to business partner, Lea has filled more roles in my life than anyone else—and I wouldn't want anyone else. Thank you for always being there for me. Thank you for research over sms, photos, voicemail, scribbled notes, sharelatex, phone calls, email, skype, long-distance flights, talks and lunch breaks. Thank you for caring about subsystems to the point that your first words in the morning could be 'it might be enough to have decoupling functions', and thank you for not getting the slightest bit annoyed when my reply was 'maybe, but can we get breakfast first?'

    The quantum information group at ETH Zurich was a second home for me, and something feels wrong about using the past tense there. I would like to thank all my colleagues for the inspiring discussions, companionship, the toasting protocol, and the friendliest atmosphere I have known. Personal thanks to Oscar Dahlsten, Johan Åberg, Normand Beaudry, Marco Tomamichel, Daniela Frauchiger, Cyril Stark, Philippe Faist, Gilles Pütz, Philipp Kammerlander, Mário Ziman, Stefan Hengl, Matthias Christandl, Aleksey Fomins, Michael Walter, David Gross and Tama ma. To the master students with whom I had the privilege to work, Raphaël, Adrian, Philipp, Bettina, Mario: thank you. I might have learned more from you than the other way around.

    The broader quantum information theory community is equally friendly and engaging, and I learned more things than I can enumerate at conferences and in discussions, in particular with Rob Spekkens, Stephanie Wehner, Lucy Liuxuang Zhang, Tony Short, Terry Rudolph, Jochen Gemmer and Jens Eisert. Above all, thank you for making me feel welcome. A special thanks to Fernando Brandão and Patrick Hayden for all the reasons above and for setting off to read the rest of this thesis. You might be a bit alarmed by the length of these acknowledgements (and of the sentences hitherto), so let me assure you that I have done my best to keep the rest concise if not concrete. I would also like to pay homage to all the amazing women working on quantum information, who are too often overlooked. Thank you for being role models, for fighting bias, for the support networks.

    As you probably know, time management is not my strong suit, and things got particularly complicated when I was due to start my first postdoc without having finished the doc itself. This put my new supervisor, Sandu Popescu, in an impossible situation, for which I take full responsibility. For reasons I cannot fathom, Sandu still took me in and essentially made this thesis happen. He did so by showing me that, deep down, the thesis was just a formality that needed to be dealt with, and by assuring me that he was looking forward to working with me after the defence, independently of the beauty of said thesis. This went against all my intuition, and the message took a while to sink in, but now that I have finished I find myself mostly agreeing. Without him, I would still be writing. Thank you for your patience, for your clarity, and for caring.

    The biggest thanks to the incredible people who proofread parts of this thesis, gave me detailed comments, proved (or more often disproved) my conjectures, disputed my notation, found examples and references, and gently reminded me to convey some intuition in between definitions and lemmas. If this thesis is at all readable, it is thanks to them. This goes to Eloísa Grifo & Jack Jeffries, Gilles Pütz, Lea Krämer, Paul Skrzypczyk and Philipp Kammerlander. Also, I am grateful to all the people who gave me strange words to hide in the thesis (of which I managed to sneak in about half).

    A word to friends and family who helped me cope with the past few months. Writing was not always easy, and neither was I. Thank you for standing by and keeping me afloat, for all the hugs and chocolate, and for caring. This goes to my mother Gisela, song of my heart, to Vassili, you extraordinary storm, to the board games crew, who kept reminding me of how fun and rewarding thinking can be, to Cuartito Azul, in particular Angela and Alexey, who gave structure to my days and believed that I could learn anything, to Yasmine, companion through many hours of writing, tea and laughter, to Mischa, for his infinite kindness and banana supply, to Eloísa, life-line over the oceans, to the Bristol group, who gave me so much to look forward to, and in particular Yelena, who shared her digs when mine were bare far, to Axiel, who showed me who I wanted to be, and to Laura, who first taught me love, and that there is always love in teaching. When it comes to non-technical thesis advice, I am particularly thankful to my dad José, who passed on to me the timeless prescription 'break it into three parts, and the writing will flow' (independently of the problem), to Martin, who said 'remember that you can pronounce PhD as [f:d]', and to Cajús, who shared Sondheim's wisdom, 'stop worrying if your vision is new; let others make that decision—they usually do'.

    Last but not least, thank you, Simon, for your love, support, and the occasional Krumphauw that restored sanity. You have seen the worst of it and stayed for the next chapter. That's when everything changes.


  • Kurzfassung

    Inspired by quantum information approaches to thermodynamics, I introduce a general framework for resource theories that takes the perspective of subjective agents. Resource theories start from a set of allowed transformations on a state space, corresponding to operations that are easy to implement or that can be carried out for free, such as local operations and classical communication, or "thermal" operations. The set of operations induces a pre-order on the state space, from which we can characterize resources. For example, we can then look for necessary and sufficient conditions for transformations, identify the free states, and try to assign value to resources. All of these concepts are treated and generalized in this thesis.

    First I formalize a way to describe subjective knowledge by means of so-called specification spaces, in which states of knowledge (or specifications) are represented by sets whose elements are the possible states of reality admitted by an observer. I show how to relate different views of reality via embeddings between specification spaces, how to treat convex state spaces and probabilities, and how to formalize the notions of learning, forgetting and closeness between states of knowledge.

    Then I introduce resource theories on specification spaces, constructed from a set of allowed operations. I characterize the resulting pre-order structure, the free resources and the conserved quantities. In addition, I show how to compare, combine and create resource theories.

    The local structure of a specification space (which in quantum mechanics is given by the tensor product of different Hilbert spaces) is not assumed a priori here, but derived operationally from modularity in the set of allowed transformations. I characterize theories with a good local structure, as well as correlations between subsystems.

    Finally, I discuss the relative value of resources by introducing the idea of a currency: a local resource that allows us to generate any other resource, provided that we start from a sufficient number of copies. In a resource theory describing locality, such a currency could be given, for example, by maximally entangled qubit pairs. With a currency we can define the buying and selling costs of resources (analogous to the entanglement of formation and of distillation) and use them to find necessary and sufficient conditions for resource transformations. Other ways of quantifying resources are also treated.

    Applications to the thermodynamics of quantum systems are illustrated in numerous examples. In particular, the assumption that thermal states are freely available is analysed from the ground up. I find necessary and sufficient conditions for typical, subjective thermalization of quantum systems.


  • Abstract

    Inspired by quantum information approaches to thermodynamics, I introducea general framework for resource theories, from the perspective of subjectiveagents. The idea behind resource theories is that there is a set of allowedtransformations on a state space, corresponding to operations that are easy toimplement, or that can be implemented for free—for instance, local operationsand classical communication, or thermal operations. The set of operationsimposes a pre-order on the state space, and from there we can characterizeresources. For instance, we may look for necessary and sufficient conditionsfor state transformations, identify the set of free states, and try to assign valueto resources. All of these concepts are addressed and generalized in this thesis.

    First I formalize a way to think of subjective knowledge: specification spaces, where states of knowledge (or specifications) are represented by sets whose elements are the possible states of reality admitted by an observer. We explore how to reconcile different views of reality via embeddings between specification spaces, how to treat convex spaces and probability, and how to convey the notions of learning, forgetting, and closeness between states of knowledge.

    Then I introduce resource theories on specification spaces, which are constructed from a set of allowed operations. I characterize the pre-order structure imposed by the set of transformations, free resources and conserved quantities. We see how to relate, combine and create different resource theories.

    The local structure of a specification space (which in quantum theory is given by tensoring different Hilbert spaces) is not assumed a priori. Instead, it is derived operationally from modularity in the set of transformations. I characterize theories with a good local structure and correlations between subsystems.

    Finally, I discuss the relative value of resources, by introducing the idea of a currency: a local resource that, given enough copies, allows us to achieve any other resource. For instance, in resource theories of locality this could be a maximally entangled pair of qubits. With a currency we can define the buying and selling cost of resources (similar to the entanglement of formation and distillation), and use this to find necessary and sufficient conditions for resource transformations. Other ways to quantify resources are mentioned.

    Applications to thermodynamics of quantum systems are discussed in various examples. In particular, the assumption that thermal states are given for free is analysed from first principles. I find necessary and sufficient conditions for typical, subjective thermalization of quantum systems.


  • Contents

    1 Introduction
    1.1 Beginnings: the toddler and the witch
    1.2 Resource theories
    1.3 Models for thermodynamics
    1.4 Generalizing resource theories
    1.5 Thesis outline

    2 The formalism of specification spaces
    2.1 Specification spaces
    2.2 Learning and forgetting
    2.3 Homomorphisms
    2.4 Embeddings
    2.5 Approximation structures
    2.6 Convexity
    2.6.1 Probability and convexity
    2.6.2 Probabilistic versions of specifications
    2.6.3 Mixtures of specifications
    2.7 Summary

    3 Resource theories
    3.1 Definition and constructions
    3.2 Pre-order on specifications
    3.3 Free resources and conservation laws
    3.4 Convex resource theories
    3.5 Approximate transformations and robustness
    3.6 Combining and relating resource theories
    3.7 Summary

    4 Subsystems
    4.1 Definition
    4.2 Local specifications
    4.3 Strong local structures
    4.4 Local resource theories
    4.5 Correlations between systems
    4.6 Permuting subsystems
    4.7 Copies of resources
    4.8 Memories, catalysts and erasure
    4.9 Summary

    5 Value of resources
    5.1 Counting resources
    5.2 Currency
    5.3 Figures of merit
    5.4 Summary

    6 Relative thermalization
    6.1 The fundamental postulate of statistical physics
    6.2 Anomalous heat flows
    6.3 Relative thermalization
    6.4 Typicality of relative thermalization
    6.4.1 A note about the proof
    6.5 Converse
    6.6 Relevance
    6.7 Outlook

    7 Conclusions and outlook
    7.1 Summary
    7.2 Future work
    7.2.1 Improving and expanding the framework
    7.2.2 Applying the framework

    A Basics of algebra and order theory
    A.1 Algebraic structures
    A.2 Order structures
    A.3 Functions in order theory
    A.4 Galois connections
    A.5 Quotient spaces

    B Properties of specification embeddings
    B.1 General properties
    B.2 Quotient specification spaces
    B.3 Convexity and embeddings
    B.3.1 Convex mixtures of specifications

    C Commutativity and subsystems
    C.1 Commutant and bicommutant
    C.2 Subsystems

    D Properties of strong local structures
    D.1 General properties
    D.2 Partial trace induces a strong local structure

    E Smooth entropies
    E.1 Smooth entropy measures
    E.1.1 Smooth min- and max-entropies
    E.1.2 Generalized smooth entropy
    E.1.3 Basic properties
    E.1.4 Chain rules
    E.1.5 Relations between the different smooth entropies
    E.2 A profusion of little lemmas for smooth entropies
    E.2.1 A few more definitions
    E.2.2 A couple of trivial bounds for the smooth entropies
    E.2.3 Three recycled lemmas
    E.2.4 Smoothness of Hε and relation to Hε′min

    F Decoupling theorems

    G Relative thermalization: proofs
    G.1 Thermalization of typical subsystems
    G.2 Converse
    G.3 Dimension bounds

    H Erasure in the presence of a quantum memory: proof
    H.1 Result and proof outline
    H.2 Finding a pure subsystem
    H.2.1 Decoupling
    H.2.2 Purification

    Bibliography


  • Chapter 1

    Introduction

    1.1 Beginnings: the toddler and the witch

    If physical theories were people, thermodynamics would be the village witch. Over the course of three centuries, she smiled quietly as other theories rose and withered, surviving major revolutions in physics, like the advent of general relativity and quantum mechanics. The other theories find her somewhat odd, somehow different in nature from the rest, yet everyone comes to her for advice, and no-one dares to contradict her. Einstein, for instance, called her 'the only physical theory of universal content, which I am convinced, that within the framework of applicability of its basic concepts will never be overthrown.'

    Her power and resilience lie mostly in her frank intentions: thermodynamics has never claimed to be a means to understand the mysteries of the natural world, but rather a path towards efficient exploitation of said world. She tells us how to make the most of some resources, like a hot gas or a magnetized metal, to achieve specific goals, be it moving a train or formatting a hard drive. Her universality comes from the fact that she does not try to understand the microscopic details of particular systems. Instead, she only cares to identify which operations are easy and hard to implement in those systems, and which resources are freely available to an experimenter, in order to quantify the cost of state transformations. Although it may stand out within physics, this operational approach can be found in branches of computer science, economics and mathematics, and it plays a central role in quantum information theory—which is arguably why quantum information, a toddler among physical theories, is bringing so much to thermodynamics.

    In the early twentieth century, information theory was constructed as the epitome of detachment from physics. Its basic premise was that we could think of information independently of its physical support: a message in a bottle, a bit string and a sensitive phone call could all be treated in the same way. This level of abstraction was not originally conceived for its elegance; rather, it emerged as the natural way to address very earthly questions, such as 'can I reliably send a message through a noisy line?' and 'how much space do I need to store a picture?'. In trying to quantify the resources required by those tasks (for example, the number of uses of the noisy channel, or of memory bits), it soon became clear that the relevant quantities were variations of what is now generally known as entropy. Entropy measures quantify our uncertainty about events: they can tell us how likely we are to guess the outcome of a coin toss, or the content of a message, given some side knowledge we might have. As such, they depend only on probability distributions over those events, and not on their actual content (when computing the odds, it does not matter whether they apply to a coin toss or to a horse race). Information theory has been greatly successful in this approach, and is used in fields from file compression to practical cryptography and channel coding.
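    The point that entropy depends only on the probabilities, and not on what the outcomes are, can be made concrete with a small sketch (mine, not from the original text), computing the Shannon entropy of a distribution:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a probability distribution.

    Only the probabilities enter the formula; relabelling the outcomes
    (coin toss vs. horse race) leaves the entropy unchanged.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1 bit of uncertainty
print(shannon_entropy([0.9, 0.1]))  # biased coin: easier to guess, lower entropy
print(shannon_entropy([1.0]))       # certain outcome: zero uncertainty
```

    The same function applies unchanged to any event space, which is precisely the detachment from physical support described above.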

    But as it turned out, not all information was created equal. If we zoom in and try to encode information in the tiniest support possible, say the spin of an electron, we face some of the perplexing aspects of quantum physics: we can write in any real number, but it is only possible to read one bit out, we cannot copy information, and we find correlations that cannot be explained by local theories. In short, we could not simply apply the old information theory to tasks involving quantum particles, and the scattered study of quirky quantum effects soon evolved into the fully-fledged discipline of quantum information theory. Today we see quantum theory as a generalization of classical probability theory, with density matrices replacing probability distributions, measurements taking the place of events, and quantum entropy measures to characterize operational tasks.

    While quantum information theory has helped us understand the nature of the quantum world, its practical applications are not as widespread as those of its classical counterpart. Technology is simply not there yet—not at the point where we may craft, transport and preserve all the quantum states necessary on a large scale. These technical limitations gave rise to resource theories within quantum information, for instance theories of entanglement. There, the rough premise is that entangled states are useful for many interesting tasks (like secret key sharing), but distributing entanglement over two or more agents by transporting quantum particles over a distance is hard, as there are always losses in the process. Therefore, all correlated states become a precious resource, and we study how to distil entanglement from them using only a set of allowed operations, which are deemed to be easier to implement—most notoriously, local operations and classical communication.

    Other resource theories started to emerge within quantum information—purity and asymmetry have also been framed as resources under different sets of constraints—and this way of thinking quickly spread among the quantum information community. As many of its members have a background in physics and an appetite for abstraction, it was a natural step for them to approach thermodynamics with such a framework in mind. Their results strengthen thermodynamics, not only by extending her range of applicability to small quantum systems, but also by laying bare her fundamental principles.

    In this introduction we will quickly review what has been done so far. We do so in three parts. First we analyse the building blocks and mechanisms that are common to many resource theories (Section 1.2). Then we go over applications to thermodynamics, and justifications behind different models, with a few examples (Section 1.3). I will keep this part short—after all, this is only the introduction, and we will get a chance to revisit these subjects later. Finally, we briefly discuss the idea of generalizing resource theories (Section 1.4). As we will see, this thesis is devoted to that same effort; an outline of the thesis structure is given in Section 1.5.


    1.2 Resource theories

    Let us introduce the basic ideas behind resource theories that can be found in the literature. The first step is to fix the state space S, which is usually compatible with a composition operation—for instance, quantum states together with the tensor product, in systems with fixed Hamiltonians. The next step is to define the set of allowed state transformations, T. For thermodynamics, these try to model adiabatic operations—like energy-preserving reversible operations, and contact with a heat bath.

    The set of allowed operations T induces a structure on the state space: we say that ρ → σ if there is an allowed transformation from ρ to σ. The relation → is a pre-order, that is, a binary relation that is both reflexive (ρ → ρ) and transitive (ρ → σ and σ → τ implies ρ → τ; this results from composing operations one after the other). These concepts will be explored in detail in Chapter 3.

    The task now is to find general properties of this structure. A paradigmatic example is looking for simple necessary and sufficient conditions for state transformations. In the most general case these are functions f such that

    • ρ → σ =⇒ f(ρ, σ) ≥ 0 (that is, f(ρ, σ) ≥ 0 is a necessary condition for state transformations), or

    • f(ρ, σ) ≥ 0 =⇒ ρ → σ (that is, f(ρ, σ) ≥ 0 is a sufficient condition for state transformations).

    If we have a necessary and a sufficient condition that only differ by a constant (independent of the dimension of the systems involved, for instance), we say that the condition is tight. Often, we try to find necessary and sufficient conditions as functions that can be written like f(ρ, σ) = g(ρ) − h(σ). In the special case where g = h for a necessary condition (ρ → σ =⇒ g(ρ) ≥ g(σ)), we call g a monotone of the resource theory. For example, in classical, large-scale thermodynamics, the free energy is a monotone.

    In order to quantify the cost of state transformations, we often fix a minimal unit in terms of a standard resource that can be composed. For example, in entanglement theory the standard resource could be a pair of maximally entangled qubits, and in quantum thermodynamics we could take a single qubit (with a fixed Hamiltonian) in a pure state. The question then is 'how many pure qubits do I need to append to ρ in order to transform it into σ?' or, more generally, 'what is the cost or gain, in terms of this standard resource, of the transformation ρ → σ?' [1, 2, 3].

    One may also try to identify special sets of states. The most immediate one would be the set of free states: those that are always reachable, independently of the initial state. In standard thermodynamics, these tend to be what we call equilibrium states, like Gibbs states. Another interesting set is that of catalysts, states that can be repeatedly used to aid in transformations. We will revisit them shortly.


    1.3 Models for thermodynamics

    Now that we have established the basic premise and structure of resource theories, we may look at different models for resource theories of thermodynamics, which vary mostly in the set of allowed operations. In the good 'spherical cow' tradition of physics, the trend has been to start from a very simple model that we can understand, and slowly expand it to reflect more realistic scenarios. In general there are two types of operations allowed: contact with a thermal bath and reversible operations that preserve some thermodynamic quantities. Each of those may come in different flavours.

    Noisy and unital operations. In the simplest case, all Hamiltonians are fully degenerate, so thermal states of any temperature are just fully mixed states, and there are no special conserved quantities. In this setting, thermodynamics inherits directly from the theory of noisy operations. We may model contact with a thermal bath as composition with any system in a fully mixed state, and reversible operations as any unitary operation. Furthermore, we assume that we can ignore, or trace out, any subsystem. Summing up, noisy operations have the form

    T(ρ_A) = Tr_{A′}( U_{AB} [ρ_A ⊗ 1_B/|B|] U†_{AB} ),

    where A′ is any subsystem of AB and U is a unitary matrix. Alternatively, we may allow only for maps T that preserve the fully mixed state, T(1) = 1, called unital maps. The two sets—noisy operations and unital maps—induce the same pre-order structure in the state space. In this setting, majorization is a necessary and sufficient condition for state transformations [4]. Roughly speaking, majorization tells us which state is the most mixed. Let r = (r_1, r_2, ..., r_N) and s = (s_1, s_2, ..., s_N) be the eigenvalues of two states ρ and σ respectively, in decreasing order. We say that r majorizes s if ∑_{i=1}^k r_i ≥ ∑_{i=1}^k s_i for any k ≤ N. In that case ρ → σ; monotones for this setting are information-theoretical entropy measures [5, 6, 3, 7, 8]. For example, if ρ majorizes σ, then the von Neumann entropy of ρ, H(ρ) = −Tr(ρ log₂ ρ), is smaller than H(σ). For a review, see [9].
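    As a quick illustration (a minimal sketch, not from the thesis; it assumes NumPy, and the helper names are mine), the majorization condition and the entropy monotone can be checked numerically for classical spectra:

```python
import numpy as np

def majorizes(r, s):
    """r majorizes s: partial sums of the decreasingly sorted
    eigenvalues of r dominate those of s for every k."""
    r, s = np.sort(r)[::-1], np.sort(s)[::-1]
    return bool(np.all(np.cumsum(r) >= np.cumsum(s) - 1e-12))

def entropy(p):
    """Von Neumann entropy of a spectrum, in bits."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rho = np.array([0.7, 0.2, 0.1])       # eigenvalues of rho
sigma = np.array([0.5, 0.3, 0.2])     # eigenvalues of sigma

assert majorizes(rho, sigma)          # hence rho -> sigma under noisy operations
assert entropy(rho) <= entropy(sigma) # entropy can only grow along the pre-order
```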

    Thermal operations. The next step in complexity is to let systems have non-degenerate Hamiltonians. The conserved quantity is energy, and equilibrium states are Gibbs states of a fixed temperature T. For instance, for a system A with Hamiltonian H_A, the equilibrium state is π_A = e^{−H_A/kT}/Z. We can model contact with a heat bath as adding any system in a Gibbs state—this corresponds to the idealization of letting an ancilla equilibrate for a long time. A first approach to model physical reversible transformations is to allow for unitary operations U that preserve energy—either absolutely (⟨E_i|ρ|E_i⟩ = ⟨E_i|UρU†|E_i⟩ for all energy eigenstates {|E_i⟩}) or on average (Tr[Hρ] = Tr[H(UρU†)] for specific states). Finally, we are again allowed to forget, or trace out, any subsystem. Together, these transformations are called thermal operations,

    T(ρ_A) = Tr_{A′}( U_{AB} [ρ_A ⊗ π_B] U†_{AB} ),


    where A′ is any subsystem of AB and U is an energy-conserving unitary. Examples of monotones found so far are different versions of the free energy, depending on the exact regime [10, 8, 11, 12] (see Example 1.1). It is worth mentioning that we can build necessary conditions for state transformations from these monotones, but tight conditions are only known for classical states (states that are block-diagonal in the energy eigenbasis) and single qubits. In the limit of a fully degenerate Hamiltonian, we recover the resource theory of noisy operations.
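    As a sanity check of this model (an illustrative sketch assuming NumPy, not code from the thesis; the variable names are mine), we can verify numerically that a thermal operation cannot drive a system out of equilibrium: composing a Gibbs state with a Gibbs bath qubit, applying an energy-conserving unitary, and tracing out the bath returns the Gibbs state.

```python
import numpy as np

beta, delta = 1.0, 1.0
p = np.exp(-beta * np.array([0.0, delta]))
pi = np.diag(p / p.sum())          # Gibbs state of one qubit with gap delta

U = np.eye(4)
U[[1, 2]] = U[[2, 1]]              # swap |01> <-> |10|: both have energy delta,
                                   # so U commutes with the total Hamiltonian

rho_AB = np.kron(pi, pi)           # system composed with a bath qubit
out = U @ rho_AB @ U.conj().T
out_A = np.trace(out.reshape(2, 2, 2, 2), axis1=1, axis2=3)  # trace out B

assert np.allclose(out_A, pi)      # no free resources from equilibrium
```

    The same check fails, as it should, if we replace U by a generic (non-energy-conserving) unitary.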

    Example 1.1 (Free energy as a monotone). This is an example of finding monotones for the resource theory of thermal operations [10]. We are interested in finding the optimal rates of conversion between two states ρ and σ, in the limit of many independent copies,

    R(ρ → σ) := sup { R : ρ^{⊗n} → σ^{⊗Rn} in the limit n → ∞ }.

    If both R(ρ → σ), R(σ → ρ) > 0, and these quantities represent optimal conversion rates, then the process must be reversible, that is, R(ρ → σ) = 1/R(σ → ρ); otherwise we could build a perpetual motion engine, and the resource theory would be trivial. The idea is to use a minimal, scalable resource α as an intermediate step. We can think of α as a currency: we will sell n copies of ρ for a number of coins, and use them to buy some copies of σ. To formalize this idea, we define the selling and buying cost of a state ρ, or more precisely the distillation and formation rates,

    R_D(ρ) := R(ρ → α),
    R_F(ρ) := R(α → ρ) = 1/R_D(ρ).

    In the optimal limit we have the process

    ρ^{⊗n} → α^{⊗nR_D(ρ)} → σ^{⊗nR_D(ρ)R_F(σ)}  ⇔  ρ^{⊗n} → σ^{⊗nR(ρ→σ)},

    which gives us the relation

    R(ρ → σ) = R_D(ρ)/R_D(σ).

    Now we have reduced the question to finding the distillation rate, and that depends on the specific choice of α. In a simple example, ρ and σ could be two classical states of a qubit with Hamiltonian H = ∆|1⟩⟨1| (that is, ρ and σ are diagonal in the energy eigenbasis). For our currency, we choose the pure state α = |1⟩⟨1|, with the same Hamiltonian. In this case, and because we work in the asymptotic limit, we can use information-compression and typicality tools to find the distillation rate [10]. The authors find that it is given by the relative entropy between ρ and the thermal state


    π,

    R_D(ρ) = D(ρ‖π) = Tr(ρ(log ρ − log π)) = β(F_β(ρ) − F_β(π)),

    where F_β(ρ) = ⟨E⟩_ρ − β^{−1}H(ρ) is the free energy of ρ at inverse temperature β. All in all, we find the conversion rate

    R(ρ → σ) = (F_β(ρ) − F_β(π)) / (F_β(σ) − F_β(π)).

    Now we can apply this result to find a monotone for a single-shot scenario: in order to have ρ → σ we need in particular that R(ρ → σ) ≥ 1. In other words, we require F_β(ρ) ≥ F_β(σ), thus recovering the free energy as a monotone for the resource theory of thermal operations. If we start working directly in the single-shot regime, we recover a whole family of monotones [8], of which the free energy is a member.
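    The classical-qubit example above can be reproduced numerically (a sketch assuming NumPy; the helper names are hypothetical). Entropies are taken in nats so that D(ρ‖π) = β(F_β(ρ) − F_β(π)) holds with natural logarithms:

```python
import numpy as np

beta, delta = 1.0, 1.0
E = np.array([0.0, delta])            # qubit Hamiltonian H = delta |1><1|
p_th = np.exp(-beta * E)
pi = p_th / p_th.sum()                # thermal state pi

def free_energy(p):
    """F_beta(p) = <E>_p - H(p)/beta, with entropy in nats."""
    p = np.asarray(p, float)
    H = -np.sum(p[p > 0] * np.log(p[p > 0]))
    return np.dot(p, E) - H / beta

rho, sigma = np.array([0.9, 0.1]), np.array([0.6, 0.4])
rate = (free_energy(rho) - free_energy(pi)) / (free_energy(sigma) - free_energy(pi))

# rho -> sigma is allowed in this asymptotic sense iff rate >= 1
assert rate >= 1
```

    One can also check that the thermal state has the minimal free energy, F_β(π) = −β⁻¹ ln Z, so the denominator above is always positive for states out of equilibrium.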

    Gibbs-preserving maps. Following the example of the theory of noisy operations, we could try to replace these thermal operations with so-called Gibbs-preserving maps, that is, maps such that T(π) = π. This constraint is easier to tackle mathematically, and the two resource theories induce the same pre-order on classical states, leading to a condition for state transformations called Gibbs-majorization (which is majorization after a rescaling of the eigenvalues) [12]. However, Gibbs-preserving maps are less restrictive than thermal operations for general quantum states [13]. For example, suppose that you have a qubit with the Hamiltonian H = E|1⟩⟨1|, and you want to perform the transformation |1⟩ → |+⟩ = (|0⟩ + |1⟩)/√2. This is impossible through thermal operations, which cannot create coherence; yet there exists a Gibbs-preserving map that achieves the task. We may still use Gibbs-preserving maps to find lower bounds on performance, but at the moment we cannot rely on them for achievability results, as they are not operationally defined.

    Coherence. The difference between thermal operations and Gibbs-preserving maps is not the only surprise that quantum coherence had in store for thermodynamics enthusiasts. The question of how to create coherence in the first place led to an intriguing discovery [14]. In order to achieve the above transformation |1⟩ → |+⟩ through thermal operations, we need to draw coherence from a reservoir. A simple example of a coherence reservoir would be a doubly infinite harmonic oscillator, H = ∑_{n=−∞}^{∞} n∆ |n⟩⟨n|, in a coherent state like |Ψ⟩ = ∑_{n=a}^{a+N} α_n |n⟩. Lasers approximate such reservoirs, which explains why we can use them to apply arbitrary transformations on quantum systems like ion traps.¹ One may ask what happens to the reservoir after the transformation: how much coherence is

    ¹The notion of coherence defined here is quantum-mechanical coherence, which should not be confused with optical coherence. In particular, the coherent states of quantum optics can act as coherence reservoirs, but they are by no means the only states that can do so.


    used up? Can we use the same reservoir to perform a similar operation in a new system? The unexpected answer is that coherence is, in a sense, catalytic: while the state of the reservoir is affected, its ability to implement coherent operations is not. What happens is that the state of the reservoir 'spreads out' a little with each use, but the property that determines the efficacy of the reservoir to implement operations stays invariant. In more realistic models for coherence reservoirs, where the Hamiltonian of the reservoir has a ground state, the catalytic properties hold for some iterations, until the state spreads all the way down to the ground state. At that stage, the reservoir needs to be recharged with energy to pump up the state again. Crucially, we do not need to supply additional coherence. In the converse direction, we know that coherence reservoirs are only critical in the single-shot regime of small systems. Indeed, in the limit of processing many copies of a state simultaneously, the work yields with and without access to a coherence reservoir converge [15].

    Catalysts. The catalytic nature of coherence raises more general questions about catalysts in thermodynamics. Imagine that we want to perform a transformation ρ → σ in a system S, and we have access to an arbitrary ancilla in any desired state γ. Now suppose that our constraint is that we should return the ancilla in a state that is ε-close to γ:

    ρ_S ⊗ γ_A → σ_SA, with ‖σ_A − γ_A‖₁ ≤ ε.

    The question is whether we can overcome the usual limits found in thermal operations by use of this catalyst. In other words, can we perform the above transformation in cases where ρ → σ would not be allowed? It turns out that if no other restrictions are imposed on the catalyst, then for any finite ε and any two states ρ and σ, we can always find a (very large) catalyst that does the job [8]. These catalysts are the thermodynamic equivalent of embezzling states in LOCC [16]. However, if we impose reasonable energy and dimension restrictions on the catalyst, we recover familiar monotones for state transformations [8, 17]. These restrictions and optimal catalysts result from adapting the concept of trumping relations on embezzling states [18, 19] to the thermodynamic setting. In particular, if we demand that ε ∝ n⁻¹, where n is the number of qubits in the catalyst, we recover the free energy constraint for state transformations [17]. A relevant open question, motivated by the findings of catalytic coherence, is what happens if we impose operational constraints on the final state of the catalyst. That is, instead of asking that it be returned ε-close to γ according to the trace distance, we may instead impose that its catalytic properties stay unaffected. It would be interesting to see if we recover similar conditions for allowed transformations under these constraints.

    Clocks. All of the resource theories mentioned allow for energy-preserving unitary operations to be applied for free. That is only the 'first order' approach towards an accurate theory of thermodynamics, though. Actually, in order to implement a unitary operation, we need to apply a time-dependent Hamiltonian to the systems involved. To control that Hamiltonian, we require very precise time-keeping—in other words, precise clocks—and we should account for the work cost of using such clocks. Furthermore, clocks are clearly out


    of equilibrium, and using them adds a source of free energy to our systems. Including them explicitly in a framework for work extraction forces us to account for changes in their state, and ensures that we do not cheat by degrading a clock and drawing its free energy. First steps in this direction can be found in [20]. There, the goal is to implement a unitary transformation in a system S, using a time-independent Hamiltonian. For this, the authors introduce an explicit clock system C that runs continuously, as well as a weight W that acts as energy and coherence reservoir. The global system evolves under a time-independent Hamiltonian, designed such that the Hamiltonian applied on S depends on the position of the clock—which effectively measures time. The authors show that such a construction allows us to approximately implement any unitary operation on S, while still obeying the first and second laws of thermodynamics. Furthermore, the clock and the weight are not degraded by the procedure (just as for catalytic coherence). In particular, this result supports the idea behind the framework of thermal operations: that energy-conserving unitaries can approximately be implemented for free (if we neglect the informational cost of designing the global Hamiltonian). Note that this is still an idealized scenario, in which the clock is infinite-dimensional and moves like a relativistic particle. A relevant open question is whether there exist realistic systems with the properties assigned to this clock, or alternatively how to adapt the protocol to the behaviour of known, realistic clocks. That direction of research can be related to the resource theory of quantum reference frames [21, 22, 23, 24].

    Example 1.2 (Heat engines). The extreme case where one of our resources is in itself a second heat bath is of particular interest. This is a very natural scenario in traditional thermodynamics: steam engines used a furnace to heat a chamber, and exploited the temperature difference to the cooler environment. The study of this limit led to landmark findings like trains, fridges and general heat engines, and to theoretical results on the efficiency of such engines. One might wonder whether these findings can also be applied at the quantum scale, and especially to very small systems composed only of a couple of qubits [25, 26]. The answer is yes: not only is it possible to build two-qubit heat engines, but they achieve Carnot efficiency [27, 28]. It is even possible to build heat engines that do not require precise control of interactions—in other words, that do not require a clock [27, 29].

    Free states and passivity. It is now time to question the other assumption behind the framework of thermal operations: that Gibbs states come for free. There are two main arguments to support it: firstly, Gibbs states occur naturally under standard conditions, and therefore are easy to come by; secondly, they are useless on their own. The first point, typicality of Gibbs states, is essentially the fundamental postulate of statistical mechanics: systems equilibrate to thermal states of Gibbs form. This assumption is discussed and ultimately justified from first principles in Chapter 6. The second point is more subtle. Pusz and Woronowicz first introduced the notion of passive states, now adapted to the following setting [30, 31, 32]. Let S be a system with a fixed Hamiltonian H, in initial


    state ρ. We ask whether there is a unitary U that decreases the energy of S, that is

    Tr(ρH) > Tr(UρU†H).

    If we can find such a unitary, then we could extract work from S by applying U and storing the energy difference in a weight system. If there is no U that achieves the condition above, then we cannot extract energy from ρ, and we say that the state is passive. The latter applies to classical states whose eigenvalues are decreasing in the energy eigenbasis. However, suppose that now we allow for an arbitrary number n of copies of ρ and a global unitary U_gl. The question becomes whether

    Tr(ρH) > (1/n) Tr( U_gl ρ^{⊗n} U†_gl H_gl ),

    where H_gl is the global Hamiltonian, which is the sum of the independent local Hamiltonians of every system. If this is not possible for any n, we say that ρ is completely passive, and it turns out that only states of Gibbs form, ρ = e^{−βH}/Z, are completely passive. Moreover, Gibbs states are still completely passive if we allow each of the n subsystems to have a different Hamiltonian, as long as all the states correspond to the same inverse temperature β. This justifies the assumption that we may bring in any number and shape of subsystems in thermal states for free, because we could never extract work from them alone—another resource is necessary, precisely a state out of equilibrium.
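    For classical states, passivity reduces to checking that populations do not increase with energy, which is easy to test numerically (an illustrative sketch assuming NumPy; the function name is mine, not from the thesis):

```python
import numpy as np

def is_passive(p, E):
    """Classical passivity check: a state is passive iff its populations
    do not increase with energy, so no permutation (a unitary acting on
    the diagonal) can lower the mean energy."""
    p = np.asarray(p, float)[np.argsort(E)]   # order populations by energy
    return bool(np.all(np.diff(p) <= 1e-12))

E = np.array([0.0, 1.0, 2.0])
assert is_passive([0.5, 0.3, 0.2], E)      # decreasing populations: passive
assert not is_passive([0.2, 0.3, 0.5], E)  # population inversion: work extractable

# Gibbs states are passive at any temperature (and in fact completely passive)
beta = 0.7
gibbs = np.exp(-beta * E) / np.exp(-beta * E).sum()
assert is_passive(gibbs, E)
```

    Complete passivity is strictly stronger: there exist passive single-system states that fail the n-copy test above, and only Gibbs states survive it for all n.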

    Different baths. The results outlined above suggest that thermodynamics can be treated as information processing under conservation laws, and so researchers began to experiment with other conserved quantities, like angular momentum [33, 34, 7], using the principle of maximum entropy to model thermal momentum baths. The state of those baths has again an exponential Gibbs form, with operators like L replacing H. The same type of monotones emerged, and similar behaviour was found for more general conserved quantities [35, 36].

    Finite-size effects. Another setting of practical interest is when we have access to a heat bath but may not draw arbitrary thermal subsystems from it. For instance, maybe we cannot create systems with a very large energy gap, or we can only thermalize a fixed number of qubits. In this case, the precision of state transformations is affected, as shown in [37], and we obtain effective measures of work cost that converge to the usual quantities in the limit of a large bath.

    Single-shot regime. Some of the studies mentioned so far characterize the limit of many independent repetitions of physical experiments, and quantify things like the average work cost of transformations or conversion rates [10, 15]. The monotones found (like the von Neumann entropy and the usual free energy) are familiar from traditional thermodynamics, because this regime approximates the behaviour of large uncorrelated systems. As we move towards a thermodynamic theory of individual quantum systems,


    it becomes increasingly relevant to work in the single-shot regime. Some studies consider exact state transformations [1, 2, 7], while others allow for a small error tolerance [38, 6, 39, 3, 11, 12, 35, 36]. The monotones recovered correspond to operational entropy measures, like the smooth max-entropy (see Example 1.3), and variations of a single-shot free energy that depend on the conservation laws of the setting; in general, they can be derived from quantum Rényi relative entropies [40] between the initial state and an equilibrium state [8, 41]. Single-shot results converge asymptotically to the traditional ones in the limit of many independent copies. The relation between single-shot and average regimes is studied via fluctuation theorems in [42].

    Definitions of work. In classical thermodynamics, we can define work as some form of potential energy that can be stored for later use. For instance, if a thermodynamic process results in the expansion of a gas against a piston, we can attach that piston to a weight, which is lifted as the gas expands. We count the gain in gravitational potential energy as work—it is well-ordered energy that can later be converted into other forms, according to the needs of an agent. A critical aspect is that at this scale fluctuations are negligible, compared to the average energy gain. In the regime of small quantum systems, this no longer holds, and it is not straightforward to find a good definition of work. Without a framework for resource theories of thermodynamics, a system for work storage is often left implicit. One option is to assume that we can perform any joint unitary operation U_SB on a system S and a thermal bath B, and work is defined as the change in energy of the two systems manipulated, W := Tr(H_SB ρ_SB) − Tr(H_SB U_SB ρ_SB U†_SB), where H_SB is the (fixed) Hamiltonian of system and bath, and ρ_SB the initial state [37]. Another example, inheriting more directly from classical thermodynamics, assumes that we can change the Hamiltonian of S and bring it in contact with an implicit heat bath [43]; work at a time t is then defined as

    W(t) := ∫₀ᵗ dt′ Tr( ρ_S(t′) dH_S(t′)/dt′ ).

    To study fluctuations around this average value, we consider work to be a random variable in the single-shot setting—this is explored by fluctuation theorems. Note that in these examples work is not operationally motivated; rather, it is defined as the change of energy that heat cannot account for. Resource theories of thermodynamics, with their conservation laws, force us to consider an explicit system W for work storage. We act globally on S ⊗ W, and we can define work in terms of properties of the reduced state of W. One proposal for the quantum equivalent of a weight that can be lifted, for the resource theory of thermal operations, is a harmonic oscillator, with a regular Hamiltonian H = ∑_n nε |n⟩⟨n|. The energy gaps need to be sufficiently small to be compatible with the Hamiltonian of S; in the limit ε → 0 the Hamiltonian becomes H = ∫ dε ε |ε⟩⟨ε| [11, 28]. Average work is defined as Tr(H_W ρ_W^final) − Tr(H_W ρ_W^initial), and fluctuations can be studied directly in the final state of the work storage system, ρ_W. This approach also allows us to observe other effects, such as the build-up of coherences in W, and of correlations between W and S. Another advantage is that we can adapt the storage system to other resource theories: for instance, we can have an angular momentum reservoir composed of many


    spins, and count work in terms of polarization of the reservoir [34].

    Example 1.3 (Landauer's principle and the work cost of information processing). How much energy is needed to perform logical operations? What are the ultimate limits for heat dissipation of computers? These questions lie at the interface between thermodynamics and information theory, and are of both foundational and practical interest. As Bennett realized, all computations can be decomposed into reversible operations followed by the erasure of a subsystem [44]. If we assume that the physical support of our computer is degenerate in energy, we recover the setting of noisy operations, in which unitaries are applied for free. That way, the thermodynamic cost of computation is simply the cost of erasure, which is defined as taking a system from its initial state ρ to a standard, predefined pure state |0⟩ (like when we format a hard drive). Rolf Landauer first proposed that the work cost of erasing a completely unknown bit of information (think of a fully mixed qubit) in an environment at temperature T is k_B T ln 2 [45]. That very same limit was also found for quantum systems, in the setting of thermal operations [46, 37], for the ideal case of an infinitely large heat bath and many operations; finite-size effects are analysed in [37].

    Using Landauer's principle as a building block, we can approach the more general question of erasing a system that is not in a completely unknown state, but rather about which we have partial information. For example, imagine that we want to run an algorithm in our quantum computer, and then erase a subsystem S (which could be a register or ancilla). The rest of our computer may be correlated with S, and therefore we can use it as a memory M, and use those correlations to optimize the erasure of S. In short, we want to take the initial state ρ_SM to |0⟩⟨0|_S ⊗ ρ_M, erasing S but not disturbing M. It was shown [6, 3] that the optimal work cost of that transformation is approximately H^ε_max(S|M)_ρ k_B T ln 2, where ε parametrizes our error tolerance and H^ε_max(S|M)_ρ is the smooth max-entropy, a conditional entropy measure that quantifies our uncertainty about the state of S, given access to the memory M. It converges to the von Neumann entropy in the limit of many independent copies. In the special case where S and M are entangled, it may become negative—meaning that we may gain work in erasure, at the cost of correlations. Not incidentally, these results use quantum information processing techniques to compress the correlations between S and M before erasure.
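    In the i.i.d. limit the smooth max-entropy converges to the conditional von Neumann entropy H(S|M) = H(SM) − H(M), so the two regimes described above—a cost of one bit for a fully mixed qubit with an uncorrelated memory, and a work gain for a maximally entangled one—can be checked in a few lines (a sketch assuming NumPy; not code from the thesis):

```python
import numpy as np

def von_neumann(rho):
    """Von Neumann entropy in bits."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return -np.sum(ev * np.log2(ev))

def cond_entropy(rho_SM):
    """H(S|M) = H(SM) - H(M) for a two-qubit state; the i.i.d. limit
    of the smooth max-entropy that governs the erasure cost."""
    rho_M = np.trace(rho_SM.reshape(2, 2, 2, 2), axis1=0, axis2=2)
    return von_neumann(rho_SM) - von_neumann(rho_M)

# Fully mixed qubit, uncorrelated memory: erasure costs kT ln 2 per bit
rho_uncorr = np.kron(np.eye(2) / 2, np.eye(2) / 2)

# Maximally entangled state: H(S|M) = -1, so erasure can yield work
phi = np.zeros(4); phi[0] = phi[3] = 1 / np.sqrt(2)
rho_ent = np.outer(phi, phi)

assert abs(cond_entropy(rho_uncorr) - 1.0) < 1e-9
assert abs(cond_entropy(rho_ent) + 1.0) < 1e-9
```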

    1.4 Generalizing resource theories

    Let us now abstract from particular resource theories, and think about their common features.

    Starting from the pre-order. As mentioned in Section 1.2, the set of allowed transformations T imposes a pre-order structure (S, ≤) on the state space S. One direction towards exploring the concept of resource theories could be to start precisely


    from such a pre-order structure. That was the approach of Lieb and Yngvason, who pioneered the idea of resource theories for thermodynamics [1, 2]. In their work, the set of allowed transformations is implicitly assumed, and one works with an abstract state space equipped with a pre-order relation. They were largely inspired by classical, macroscopic thermodynamics, as one may infer from the conditions imposed on the state space, but their results can be applied to the thermodynamics of small quantum systems [7]. Assuming that there exist minimal resources that can be scaled arbitrarily and act as 'currency', the authors obtain monotones for exact, single-shot state transformations. When applied to the pre-order relation on classical states that emerges from thermal operations, these monotones become single-shot versions of the free energy [7].

    Starting from the set of free resources. In [41], Brandão and Gour characterize general quantum resource theories based on the set of free resources of each theory. Assuming that the set of free states is well-behaved (for instance, that it is convex, and that the composition of two free states is still a free state), they show that the relative entropy between a resource and the set of free states is a monotone. This is because the relative entropy is contractive (non-increasing under quantum operations); the same result applies to any contractive metric. Finally, they find an expression for the asymptotic value of a resource in terms of this monotone: the conversion rate between two resources is given by the ratio between their asymptotic values.

    In category theory. Coecke et al. have generalized the space of a resource theory to objects known as symmetric monoidal categories [47]. These can represent essentially any resource that can be composed (in the sense of combining copies of different resources, like tensoring states in quantum theory). The authors consider both states and processes as possible resources. After obtaining the pre-order structure from a set of allowed operations, resource theories can be classified according to several parameters. For instance, the authors identify quantitative theories (where having more of a resource helps, like for thermal operations) and qualitative ones (where it helps to have many different resources). They give varied examples of resource theories, which illustrate just how general this concept is.

    Cryptographic resource theories. Classical and abstract cryptography are fields where resource theories emerge naturally: constraints and the subjectivity of agents are the name of the game, and are always treated explicitly. A first attempt towards generalizing cryptographic resource theories can be found in [48].

    1.5 Thesis outline

    In this thesis I go one step further into the bottomless depths of abstraction and generalization by considering resources as states of knowledge. Approximate resource transformations can also be treated explicitly. The notion of subsystems in the space of resources (and therefore the composition of different resources) is derived operationally. This framework


    covers any resource theory that I am aware of, and the reader is naturally encouraged to find counterexamples. After the defence. The thesis is conspicuously divided into three parts.

    1. In this introduction we reviewed some aspects of resource theories of thermodynamics. This motivates the main part of the thesis, a generalization of resource theories.

    2. In Chapters 2 to 5 we leave thermodynamics aside and set up a generalized framework for resource theories of knowledge. First we introduce the concept of specification spaces, which formalize a way of thinking about subjective knowledge (Chapter 2). We will be able to talk about states of knowledge, learning and forgetting, approximations, embeddings in higher theories, and how to treat probabilistic mixtures of states.

    Then we equip the specification space with an abstract set of allowed transformations, to obtain a resource theory (Chapter 3). We will be able to formulate the approaches reviewed here in that framework. We will study the pre-order structure imposed by the theory on the specification space, derive the concept of free states, and discuss convexity and approximate transformations. We will also see how to combine and relate resource theories.

    The next step is to derive a subsystem structure from modularity of transformations (Chapter 4). We introduce the notion of local resources and how to combine them, characterize independent systems, and introduce the concepts of catalysts and memories.

    Finally, we discuss the value of resources (Chapter 5). There we derive the notion of a currency and show how to use it to find necessary and sufficient conditions for state transformations, and we relate currencies to other measures of value, like work.

    3. In Chapter 6 we return to thermodynamics, to discuss the question of free states in thermodynamics, and study thermalization of quantum systems from a new perspective. We will see that the notion of thermalization that is normally sought after is fundamentally incomplete. Then I introduce and explore the concept of relative thermalization, and investigate whether it occurs typically in Nature [49].

    Contributions. Wherein I clarify the contributions I made in the course of my PhD:

    1. This introduction was written for the thesis, and will be part of a review paper on recent approaches to thermodynamics, The role of quantum information in thermodynamics—a topical review, in preparation together with John Goold, Marcus Huber, Arnau Riera and Paul Skrzypczyk.

    2. Chapters 2, 3, 4 and 5 are original, unpublished work, developed by me in collaboration with Lea Krämer. The concept of specification spaces is first mentioned in [48], and approximation structures and resource theories on specification spaces are drafted in working notes by Renato Renner and Ueli Maurer, in the context of cryptographic resource theories. Although in spirit their concepts are very similar to ours, and Chapter 2 was directly inspired by those notes, we take a different approach from the first definitions.

    3. I was also first author on two research papers that highlight the role of subjectivity in thermodynamics of quantum systems. The first paper, The thermodynamic meaning of negative entropy, with co-authors Johan Åberg, Renato Renner, Oscar Dahlsten and Vlatko Vedral, concerns the thermodynamic cost of information-processing tasks in the presence of a quantum memory [6]. In this thesis, it is referred to in Chapter 5 (Example 5.3) and explained in more detail in Appendix H. Part of the framework and applications of those results are omitted here, and I refer the interested reader to the original paper for a complete picture.

    4. The second paper, Relative thermalization, with Adrian Hutter, Renato Renner and Stephanie Wehner [49], can be found in Chapter 6, with proofs in Appendix G. In that paper, we study thermalization of quantum systems in relation to a reference system, like a quantum memory, and argue that, to ensure that two bodies behave according to the traditional laws of thermodynamics, they should be thermalized relative to each other. The paper also includes minor lemmas characterizing the smooth entropy Hε and how it relates to the smooth min- and max-entropies. Descriptions of smooth entropies and these lemmas can be found in Appendix E.

    Note to the reader. Previous knowledge of quantum theory or thermodynamics is not necessary to follow the exposition of the framework in Chapters 2 to 5, although it helps to understand the intuition behind some notions, and many of the examples (in particular in Chapter 4, where we discuss subsystems). Quantum theory is however essential to follow Chapter 6, about thermalization of quantum systems. Basic notions of algebra and order theory are introduced in Appendix A and are central to the whole framework. As pronouns go, the author uses ‘we’ when she wants the reader to feel included and when referring to coauthors of the original papers, ‘I’ when she does not want to incriminate anyone else, and ‘she’ when explaining the difference. Let us begin.


  Chapter 2

    The formalism of specification spaces

    In this chapter, we introduce a new way to look at how we organize and process knowledge, in the form of specifications. It is operationally motivated and very intuitive, but it may still get fairly technical at times: all the necessary notions of algebra and order theory are defined in Appendix A.

    States of knowledge as sets. Before we dig in, let us start with a few very simple examples. Imagine that I draw a box, and tell you that inside it there is either a sheep or a fox. How would you represent your knowledge of the animal inside the box? The best you can do is say it is one of the two—and we can represent this by a set {sheep, fox}, whose two elements are the possible states of reality. In quantum mechanics, the situation is more subtle. If I tell you that inside the box there is one of two quantum states ρ or σ, chosen uniformly at random, you can represent your knowledge via a mixture of the two states with equal probability, (ρ + σ)/2. However, if I just tell you ‘it is either ρ or σ’, you have no reason to assign a particular probability distribution to the two options, and you can only represent your knowledge as the set {ρ, σ}, meaning ‘it is either ρ or σ, and I don’t know the probability distribution.’

    Specificity of a description. Some descriptions of knowledge are less specific than others. For instance, if I say I saw an elephant, I could mean either an African or an Asian elephant, and you wouldn’t know which. In other words, the specification ‘elephant’ corresponds to a set of two elements: {African elephant, Asian elephant}. Each of these elements also corresponds to a set, for instance ‘African elephant’ = {male African elephant, female African elephant}, and so on. A special example of this in quantum theory is the description of reduced density states. What do we mean when we say that we have a local state ρA in a subsystem A? In terms of the global Hilbert space, this knowledge corresponds not to a single state, but to the set of all global states that have marginal ρA on A. In the following we will formalize these ideas about knowledge, and we will revisit these examples.


    2.1 Specification spaces

    We start with our state space; this is simply a set Ω. The elements of Ω are the most precise descriptions that an observer may have of elements of reality—in that sense they are epistemic. We do not impose any special structure on Ω.

    Example 2.1 (Quantum state-space). For a quantum-mechanical resource theory, the state space could be the set Ω of all normalized density operators over a fixed global Hilbert space H.

    Example 2.2 (Generalized probability theories state-space). One approach to generalized probability theories (GPTs) considers a fixed number N of black boxes with inputs A = (A1, A2, . . . , AN) and outputs X = (X1, X2, . . . , XN). Each box represents one experiment or event; the input corresponds to a setting that an experimenter is free to choose, and the output is just the perceived outcome of the experiment. For instance, boxes could represent fiducial measurements of physical systems. In this setting, the state-space could be the set P of all conditional probability distributions PX|A. In another example, we could impose a space-time structure on the boxes, and allow only for non-signalling distributions, PNS.

    Example 2.3 (Zoologic state-space). For resource theories in the animal kingdom, the elements of Ω correspond to the best possible description of different animals—for instance, ‘Asian elephant’, if we could identify them down to the species. For a theory of single animals, Ω consists of all known animal species.

    Specifications are states of knowledge for a given observer, represented by subsets of Ω, which consist of all the possibilities that the observer admits for the state of reality. For instance, the specification {ω, ν} expresses the notion of ‘I know that we either have state ω or ν.’ The inclusion relation has an immediate interpretation: U ⊆ V means that V is a less specific description than U.

    Definition 2.1 (Specification space). Let Ω be a set. The specification space of Ω, SΩ, is the power set of Ω excluding the empty set,

    SΩ = 2^Ω \ {∅}.

    Elements of SΩ, which are subsets of Ω, are called specifications.

    We will comment on the significance of the empty set in Section 2.2.

    Remark 2.2. A specification space is a poset ordered by inclusion, (SΩ, ⊆). In particular, it is a join-semilattice (SΩ, ∪), where the join operation is the union of sets. We may define the meet of two specifications via set intersection ∩, but the structure is not closed under ∩ (because we may reach the empty set).
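For a finite state space, the definition and the closure properties of Remark 2.2 can be checked directly. A minimal Python sketch, with a toy three-element state space of my choosing:

```python
from itertools import combinations

def specification_space(omega):
    """S_Omega: all non-empty subsets of the state space Omega."""
    elems = sorted(omega)
    return {frozenset(c) for r in range(1, len(elems) + 1)
            for c in combinations(elems, r)}

omega = {"sheep", "fox", "hen"}
S = specification_space(omega)
# |S_Omega| = 2^|Omega| - 1 (the power set minus the empty set)
assert len(S) == 2 ** len(omega) - 1

V, W = frozenset({"sheep"}), frozenset({"fox"})
assert V | W in S        # join (union): always a specification
assert (V & W) not in S  # meet (intersection) can leave the space: empty set
```

The last two assertions mirror the remark: unions stay inside SΩ, but intersecting disjoint specifications produces the excluded empty set.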


    Figure 2.1: Specification space. A schematic representation of a specification space SΩ. The top is Ω, and at the bottom we have all the singletons {ω} with ω ∈ Ω. The specification {ω, ω′} contains both singletons and can be interpreted as the knowledge that we either have state ω or ω′. Note: in real life, this graphical representation would not look like a triangle, but rather like an inflated balloon, as the number of elements on each row would be given by the binomial coefficients (|Ω| choose k), starting at the bottom with k = 1 and ending at the top with k = |Ω|.

    See Fig. 2.1 for a graphical representation. Note that at this stage we do not allow for probability assignments to states, of the sort ‘I know that we have ω with probability 70% and ν with probability 30%.’ This is because we do not want to impose a notion of probability at the level of specifications. In general, if a theory allows for some interpretation of probabilities, they are already included in the state space—that is, Ω is a convex set. This is the case of quantum theory, where density matrices can be seen as probabilistic mixtures of quantum states. Convexity and probabilities are further discussed in Section 2.6.

    Example 2.4 (Quantum specifications). In quantum theory, specifications are elements of SΩ, where Ω is the set of density matrices over the global Hilbert space H. For instance, the specification {ρ, σ} means that the observer knows that H is either in state ρ or σ. Note that if the observer assigned a probability distribution (p, 1 − p) to those two possibilities, they could specify their knowledge with a new density matrix, {p ρ + (1 − p) σ}.

    A specification of particular interest is a description of a subsystem. If an observer says that subsystem A of H is in state ρA, this implies that her knowledge is limited to system A; the rest of the universe could be in any compatible state. In our framework, this is represented by the marginal specification ρ̂A ∈ 2^Ω, which is the set of all states of H that have ρA as marginal,

    ρ̂A := {ω ∈ Ω : ωA := Tr_Ā(ω) = ρA}.
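In finite dimensions, membership in a marginal specification can be tested mechanically. The sketch below (a toy illustration; the dimensions and states are arbitrary choices of mine) computes the marginal on A by tracing out the rest of the system, and exhibits two quite different global states that both belong to ρ̂A:

```python
import numpy as np

def marginal_A(omega, dA=2, dB=2):
    """omega_A: trace out everything but subsystem A (here, subsystem B)."""
    return np.trace(omega.reshape(dA, dB, dA, dB), axis1=1, axis2=3)

def in_marginal_spec(omega, rho_A):
    """Is the global state omega an element of the specification rho_A-hat?"""
    return np.allclose(marginal_A(omega), rho_A)

rho_A = np.diag([0.7, 0.3])
# Two distinct global states with the same marginal on A:
omega1 = np.kron(rho_A, np.eye(2) / 2)        # product with maximally mixed B
omega2 = np.kron(rho_A, np.diag([0.9, 0.1]))  # different state on B
assert in_marginal_spec(omega1, rho_A) and in_marginal_spec(omega2, rho_A)
```

That two distinct global states pass the test is the whole point: ρ̂A is a set, not a single state.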


    Figure 2.2: Left: two leafcutter ants. [The stronger of the two. Licensed under CC BY 2.0 via Wikimedia Commons.] Right: an alligator snapping turtle, species Macrochelys temminckii. [Source: U.S. Fish and Wildlife Service National ]

    Another example is the specification of independence between two subsystems A and B of H,

    ⊗AB := {ω ∈ Ω : ωAB = ωA ⊗ ωB} = {ω ∈ Ω : I(A : B)ω = 0},

    where I(A : B)ω denotes the quantum mutual information between A and B.

    Example 2.5 (GPT specifications). An example of a specification in a box world with a given space-time structure is the set of all local probability distributions, which is contained in the specification of all probability distributions allowed by quantum mechanics. Another specification could be the set of all locally tomographic distributions.

    Example 2.6 (Zoologic specifications). In single-animal theories, ‘Mammal’ is a specification: a set that includes all animals that are mammals; ‘Marsupial’ and ‘Banana-eater’ are other intuitive examples. Note that all sets of animals are specifications of an observer’s knowledge (for instance, {Leafcutter ant, Macrochelys temminckii} means that the observer knows that he is before one of those two animals, bizarre as that may sound; see Fig. 2.2), but as we will see ahead, there are special specifications with natural interpretations.

    2.2 Learning and forgetting

    The intuitive notion of forgetting some information corresponds to going from a specification W to a less specific one, W′ ⊃ W. A function f on a specification space such that f(W) ⊇ W is called an inflating function, and corresponds to some type of forgetting. Learning, on the other hand, corresponds to combining knowledge, which is expressed by the intersection of different specifications.


    Figure 2.3: Two egg-laying mammals. Left: platypus. [“Platypus” by Stefan Kraft, photographed on 20/9/2004 at Sydney Aquarium.] Right: Western long-beaked echidna, one of the four remaining species of echidna. [Long-beaked Echidna, by User:Jaganath, transferred from English Wikipedia. Licensed under CC BY-SA 3.0 via Wikimedia Commons.]

    Definition 2.3. Let SΩ be a specification space. Two specifications V, W ∈ SΩ are said to be compatible if V ∩ W ≠ ∅.

    Example 2.7 (Knowledge combination in zoology). As an example of the intersection of several specifications in a single-animal theory, we have

    Mammal ∩ Egg-laying ∩ Extant = Platypus ∪ (Extant ∩ Echidna),

    where ‘Extant’ means ‘not extinct’; this is a set with 5 elements (see Fig. 2.3).a

    a You might wonder why I present such a contrived combination. Originally I thought that Mammal ∩ Egg-laying = Platypus would be a very simple and intuitive example, only to find that there exist more egg-laying mammals, all in Oceania, and that several more species are now extinct.
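The same combination can be spelled out in code. The species lists below are a simplified, illustrative snapshot (the four extant echidna species plus the platypus), not a zoological reference:

```python
echidnas = {"short-beaked echidna", "eastern long-beaked echidna",
            "western long-beaked echidna", "Attenborough's long-beaked echidna"}

# Toy specifications, each a set of possible species
mammal     = {"platypus", "tiger", "African elephant"} | echidnas
egg_laying = {"platypus", "hen"} | echidnas
extant     = {"platypus", "tiger", "hen", "African elephant"} | echidnas

# Learning = intersecting compatible specifications
combined = mammal & egg_laying & extant
assert combined == {"platypus"} | echidnas  # 5 egg-laying mammals
assert len(combined) == 5
```

Each intersection narrows the set of admitted possibilities, which is exactly the formal sense in which learning makes a description more specific.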

    Example 2.8 (Being wrong). This example illustrates why we exclude the empty set from the specification space. Imagine that you and I both saw an animal, which I am sure was a tiger and you are certain was an elephant. If we tried to combine our knowledge, we would obtain a contradiction: formally, Tiger ∩ Elephant = ∅. In other words, the two specifications are not compatible.

    The empty set corresponds to the idea of being wrong—having contradicting knowledge. If we are wrong, we may believe that we can achieve anything: we can go to any specification with the operation of forgetting, since the empty set is contained in all sets. We would obtain a trivial resource theory where everything is possible—a theory of the booboisie, where we could create supernovas via hypermethylation of bananas.

    Example 2.9 (Learning and forgetting in quantum theory). A simple example of knowledge combination in quantum theory would be {ρ, σ} ∩ {σ, τ} = {σ}. But there are more interesting examples out there. For instance, consider the tensor product of two quantum states, like ρA ⊗ σB. This state sums up the knowledge of three facts: system A is locally in state ρA, system B is locally in state σB, and A and B are independent. Formally,

    ρ̂A ∩ σ̂B ∩ ⊗AB
    = {ω ∈ Ω : ωA = ρA} ∩ {ω ∈ Ω : ωB = σB} ∩ {ω ∈ Ω : ωAB = ωA ⊗ ωB}
    = {ω ∈ Ω : ωAB = ρA ⊗ σB},

    which is precisely the marginal specification of the product state ρA ⊗ σB.

    An intuitive example of forgetting is taking the partial trace, as ρ̂AB ⊂ ρ̂A. We will revisit subsystem composition and the partial trace in Chapter 4.

    2.3 Homomorphisms

    Now it is time to introduce functions on specification spaces. We may use functions to characterize certain aspects of specifications (like the colour of an animal) or to express transformations on the state-space (like quantum operations).

    The idea that specifications are states of knowledge is reflected by having functions that act individually on each element of a specification. These correspond to semilattice homomorphisms, which preserve the structure of a specification space.

    Remark 2.4. Let SΩ and SΣ be two specification spaces. Then, for any function f : SΩ → SΣ, these two statements are equivalent:

    1. f is a join-semilattice homomorphism, that is, for any set 𝒲 ⊆ SΩ of specifications,

    f(⋃_{W∈𝒲} W) = ⋃_{W∈𝒲} f(W);

    2. f is an element-wise function, that is, there exists a function f̃ : Ω → SΣ such that

    f : SΩ → SΣ
    W ↦ ⋃_{ω∈W} f̃(ω).    (2.1)

    In the following, we use the expression ‘homomorphisms’ to denote join-semilattice homomorphisms.


    Remark 2.5. Any function between two state-spaces, f : Ω → Σ, can be used to build a homomorphism,

    f : SΩ → SΣ
    W ↦ ⋃_{ω∈W} {f(ω)}.

    For simplicity, we may denote f({ω}) by f(ω).
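Equation (2.1) is easy to mirror in code. The sketch below (with toy state spaces of my choosing) lifts a base function f̃ : Ω → SΣ to an element-wise map on specifications and checks the join-homomorphism property of Remark 2.4:

```python
def lift(f_tilde):
    """Element-wise extension (eq. 2.1): f(W) = union of f_tilde(w), w in W."""
    return lambda W: frozenset().union(*(f_tilde(w) for w in W))

# Toy base function on integers: each state maps to a one-element specification
f_tilde = lambda n: frozenset({n % 2})
f = lift(f_tilde)

V, W = frozenset({1, 2}), frozenset({3, 5})
# Join-semilattice homomorphism: f commutes with unions
assert f(V | W) == f(V) | f(W)
assert f(W) == frozenset({1})  # both 3 and 5 are odd
```

Any function defined this way is automatically a homomorphism; the interesting counterexamples are functions that cannot be written element-wise, as in Example 2.11 below.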

    Lemma 2.6. The composition of any two element-wise functions in SΩ is also an element-wise function.

    Proof. Take two element-wise functions, f(W) = ⋃_{ω∈W} f̃(ω) and g(W) = ⋃_{ω∈W} g̃(ω). We have

    g ∘ f(W) = g(⋃_{ω∈W} f̃(ω))
    = g(⋃_{ω∈W} ⋃_{ω′∈f̃(ω)} {ω′})
    = ⋃_{ω∈W} ⋃_{ω′∈f̃(ω)} g̃(ω′)
    =: ⋃_{ω∈W} (g̃ ∘ f)(ω),

    where we defined the function

    g̃ ∘ f : Ω → SΩ
    ω ↦ ⋃_{ω′∈f̃(ω)} g̃(ω′).

    The following lemma concerns knowledge combination and homomorphisms: it says that, given two specifications, we always get a more precise description by combining them first and applying a homomorphism second than the other way around. We will see a relevant application in Section 2.5, when we discuss approximation structures.

    Lemma 2.7. Let SΩ and SΣ be two specification spaces, and let 𝒲 ⊆ SΩ be any subset of compatible specifications (that is, ∩𝒲 ≠ ∅). Then, for any homomorphism f : SΩ → SΣ,

    f(∩𝒲) ⊆ ⋂_{W∈𝒲} f(W) ∈ SΣ.


    Proof. Using the fact that f is a homomorphism, and therefore we can write f(W) = ⋃_{ω∈W} f̃(ω), we have

    ⋂_{W∈𝒲} f(W) = ⋂_{W∈𝒲} (⋃_{ω∈W} f(ω))
    = ⋂_{W∈𝒲} [(⋃_{ω∈∩𝒲} f(ω)) ∪ (⋃_{ω∈W\(∩𝒲)} f(ω))]
    ⊇ ⋂_{W∈𝒲} (⋃_{ω∈∩𝒲} f(ω))
    = ⋃_{ω∈∩𝒲} f(ω)
    = f(∩𝒲).
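A randomized sanity check of the lemma (a toy sketch; the base map and the sampling are arbitrary choices). Applying the homomorphism first can merge distinct states, so intersecting afterwards retains spurious possibilities; combining first is never less precise:

```python
import random

def lift(f_tilde):
    """Element-wise extension of f_tilde to specifications."""
    return lambda W: frozenset().union(*(f_tilde(w) for w in W))

f = lift(lambda n: frozenset({n % 3}))  # many-to-one: precision can be lost

random.seed(0)
omega = list(range(12))
for _ in range(100):
    shared = random.choice(omega)  # force compatibility: shared element
    Ws = [frozenset(random.sample(omega, 5)) | {shared} for _ in range(3)]
    meet = frozenset.intersection(*Ws)
    # Lemma 2.7: f(meet of Ws) is contained in the meet of the f(W)
    assert f(meet) <= frozenset.intersection(*(f(W) for W in Ws))
```

The inclusion is typically strict here, because two different states can share the same image under the base map.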

    Definition 2.8 (Equality of homomorphisms). Let f and g be two homomorphisms from a specification space SΩ to another, SΣ. We say that f = g if f(W) = g(W) for all W ∈ SΩ.

    Remark 2.9. It is sufficient to demand that f(ω) = g(ω) for all ω ∈ Ω, because the two functions are homomorphisms.

    Example 2.10 (Quantum endomorphisms). Trace-preserving completely positive maps (TPCPMs) map density operators to density operators, and therefore may be used to build quantum specification endomorphisms (Remark 2.5). For example, a unitary operation is applied to a quantum specification as

    U : 2^Ω → 2^Ω
    W ↦ ⋃_{ω∈W} {UωU†}.

    Example 2.11 (Animal homomorphisms). Let Ω be the state-space of single animals and Σ the set of all colours. Now take the function that maps an animal to its colours, f̃ : Ω → SΣ; for instance f̃(Tiger) = {Orange, Black, White}. We can use f̃ to build a homomorphism f : SΩ → SΣ which maps a specification to all possible colours of its elements. For instance,

    f({Tiger, Leopard}) = f̃(Tiger) ∪ f̃(Leopard)
    = {Orange, Black, White} ∪ {Yellow, Black, White}
    = {Orange, Black, White, Yellow}.

    As an example of a function that is not a homomorphism, consider g : SΣ → SΩ to be the function that maps a set of colours to the set of animals that exhibit precisely those colours. Clearly g(V ∪ W) ≠ g(V) ∪ g(W); for instance,

    Tiger ∈ g({Orange, Black, White}),
    Tiger ∉ g(Orange) ∪ g(Black) ∪ g(White).

    Here, g is not a homomorphism because it does not treat specifications as states of knowledge. That is, under g, a set Z ∈ SΣ does not reflect the idea ‘I have one of these colours, and I do not know which’, but rather ‘I have exactly all of these colours with certainty’. Indeed, one could use Remark 2.5 to build a homomorphism from g in a larger space,

    g′ : S_{SΣ} → SΩ
    Z ↦ g(Z).

    For instance, an element of S_{SΣ} could be

    Z = {{Orange, Black, White}, {Black, White}},

    meaning ‘either I have the first three colours or the last two’, and examples of animals that exhibit one of the two sets of colours are

    {Tiger, Clownfish, Zebra} ⊂ g′(Z).
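The colour example can be checked mechanically. In the sketch below (the colour assignments are the toy data of the example), f is built element-wise and therefore commutes with unions, while g, which reads a colour set as ‘exactly these colours’, does not:

```python
colours = {  # toy colour data from the example
    "Tiger":   frozenset({"Orange", "Black", "White"}),
    "Leopard": frozenset({"Yellow", "Black", "White"}),
    "Zebra":   frozenset({"Black", "White"}),
}

def f(W):
    """Homomorphism: all colours that the elements of W might exhibit."""
    return frozenset().union(*(colours[a] for a in W))

def g(Z):
    """Not a homomorphism: animals exhibiting *precisely* the colours Z."""
    Z = frozenset(Z)
    return frozenset(a for a, c in colours.items() if c == Z)

V, W = frozenset({"Tiger"}), frozenset({"Leopard"})
assert f(V | W) == frozenset({"Orange", "Black", "White", "Yellow"})
assert f(V | W) == f(V) | f(W)  # element-wise, hence a homomorphism

assert "Tiger" in g({"Orange", "Black", "White"})
assert "Tiger" not in g({"Orange"}) | g({"Black"}) | g({"White"})
```

The last two assertions reproduce the displayed counterexample: splitting the colour set changes what g means, so g cannot commute with unions.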

    2.4 Embeddings

    We said before that the state-space Ω corresponds to an observer’s most precise description of reality. This is not necessarily the most complete description possible: another observer may know that there are more things in heaven and earth than are dreamt of in Ω. Additional information about reality can come in two flavours: either Hamlet simply knows of more elements of reality, in which case his state-space is an extension of Ω, or he has more specific descriptions of the elements of Ω. These two cases are formalized as extensive and intensive embeddings of specification spaces.

    Embeddings, Galois connections and related notions are defined in Appendix A. The short version is that an embedding e : SΩ → SΣ is a map that satisfies V ⊆ W ⇔ e(V) ⊆ e(W). In particular, embeddings are injective and order-preserving. A Galois insertion of SΩ in SΣ is a set of two maps (e, h), where e : SΩ → SΣ and h : SΣ → SΩ, such that h ∘ e is the identity in SΩ and e(V) ⊆ Z ⇔ V ⊆ h(Z).

    Figure 2.4: Embeddings. Two extreme examples of embeddings of a specification space SΩ into a larger one, SΣ. On the left, an extensive embedding: here, e(Ω) ⊆ Σ. On the right, an intensive embedding, defined by the Galois insertion (e, h).

    Definition 2.10 (Embeddings). Let SΩ and SΣ be two specification spaces. A function e : SΩ → SΣ is a specification embedding if it is an order embedding and a specification homomorphism. In this case we say that SΩ is embedded in SΣ, and we may call SΩ the reduced specification space. We denote the embedding of SΩ in SΣ by e(SΩ) := {e(V)}_{V∈SΩ}.

    A specification embedding is extensive if |e({ω})| = 1 for all ω ∈ Ω, and intensive if there exists an adjoint homomorphism h : SΣ → SΩ such that (e, h) is a Galois insertion.
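An intensive embedding can be prototyped with a refinement map. In this sketch (the refinement data is a made-up toy; compare Fig. 2.4), e replaces each coarse state by its set of finer states, and the adjoint h keeps only the coarse states whose entire refinement is still admitted; together they form a Galois insertion:

```python
refine = {  # toy refinement: each coarse state splits into disjoint fine states
    "elephant": frozenset({"African elephant", "Asian elephant"}),
    "tiger":    frozenset({"Bengal tiger", "Siberian tiger"}),
}

def e(V):
    """Intensive embedding: refine every coarse element of V."""
    return frozenset().union(*(refine[v] for v in V))

def h(Z):
    """Adjoint: coarse states whose whole refinement is contained in Z."""
    Z = frozenset(Z)
    return frozenset(v for v, fine in refine.items() if fine <= Z)

V = frozenset({"elephant"})
assert h(e(V)) == V  # h . e is the identity on S_Omega

Z = e(V) | {"Bengal tiger"}
assert (e(V) <= Z) == (V <= h(Z))  # Galois insertion condition
```

Because the refinements partition the fine state space, both Galois insertion conditions hold for every choice of V and Z in this toy model.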

    Example 2.12 (Animal embeddings). As an example of extensive embeddings, suppose that you discover a new animal species, Saltuarius eximius. T