What’s New in Conan 2.0 @ CPPCon 2022

January 20, 2023

2 min read

During the 4+ years since Conan 1.0 was released, we have continued to learn from the C++ ecosystem as we watched it grow, absorbing lessons, challenges and industry trends from tens of thousands of conversations with users and customers, including many of the largest C++-related companies in the world. This talk summarizes some of these lessons and how they have shaped the new major version of Conan. For example, while many advocate for header-only or always-static linkage, using shared libraries, and especially shared libraries that link and embed other static libraries, is a very common pattern in the industry. Conan 2 implements “requirement traits” that go beyond CMake private/public usage requirements, allowing users to specify complex dependency graphs, including bootstrapping, cross-building with tools, private dependencies and much more.

Many teams produce libraries, applications, SDKs or other artifacts that need to be deployed or consumed by other teams using different technologies (like Java, Node, etc.) and not using Conan. Conan 2’s new “deployers”, which can be user-defined, allow easy extraction of artifacts, as well as automation to create Debian packages, Windows installers or any other type of derived artifact.

Many very large enterprise C and C++ projects need to manage binaries (and Bills of Materials), not only to reduce build times but also for traceability, security, industry policies, etc. Creating and managing binaries at scale is still a big challenge. Conan 2 provides much simpler “lockfiles” to allow full reproducibility of the dependency graph, a new user-defined global binary compatibility mechanism, and a more accurate computation of which package binaries need to be rebuilt when a dependency changes.

Finally, we learned that package management is yet another piece in a larger DevOps, Continuous Integration, automation and developer-experience puzzle, and many users have been building layers of automation on top of Conan. So Conan 2 ships a new public Python API (around 65% of users already use Python to automate C and C++ workflows) and new user-defined modular commands to extend the Conan CLI.


Diego Rodriguez-Losada Gonzalez


    Diego Rodriguez-Losada Gonzalez’s passions are robotics and software engineering and development. He developed for many years in C and C++ in the industrial, robotics and AI fields. Diego was also a (tenure-track) university professor and robotics researcher for 8 years, until 2012, when he quit academia to try to build a C/C++ dependency manager and co-founded a startup. Since then he mostly develops in Python. Diego is a co-creator and maintainer of the conan.io C/C++ package manager, now working at JFrog as a senior software engineer and C/C++ advocate.

    Video Transcript

    Hello, good afternoon. My name is Diego, I am one of the Conan co-founders, and today I'm talking about what's new in Conan 2.0. Spoiler: everything is new in Conan 2. Around 80% of the code base is completely new, and that includes the 20% that we have back-ported to Conan 1 to make a subset of the syntax available, so we allow an easy migration of recipes from Conan 1 to Conan 2. We have basically been five years without breaking, and this major release includes all these major changes.

    Why are we changing that much? Because in these five years we have had tons of feedback. Conan is very widely used. For example, in the CPPLang Slack the Conan channel is ranked among the most used channels in the whole Slack; last month it was the second most used channel by the number of different users posting in it. Conan gets around 600,000 downloads per month from the Python Package Index, and we also have other releases, like installers, that are not counted there. We have been designated a critical project in the Python Package Index, which means we are in the top one percent of the most downloaded packages in the whole index.

    And of course we also get tons of feedback from the users: last year we got more than 4,000 pull requests to the GitHub repos that we maintain. Besides pull requests, last year we also responded to around 2,000 GitHub issues; that's our main support channel. We also did a bunch of video calls with users and customers, something we now do on a weekly basis, and we do tons of direct support with users, typically via Slack. And because we are part of JFrog, we also know how many people are using Conan thanks to the telemetry from Artifactory servers: last year there were at least around 8,000 teams and companies using Conan in production, because their servers were live and receiving Conan calls.

    The feedback was actually so large that we created a working group. We gathered around 70 people from different companies, some of them here, and with this feedback group we have been iterating proposals for Conan 2.0 and getting structured feedback from them. I want to say a big thank you to this group; we call it the Tribe, the Conan 2.0 Tribe, because without them Conan 2.0 would have been impossible. So today I've structured my talk around five important lessons we have learned from the ecosystem during this time, and how Conan 2 is addressing these concerns.

    Let's start with the first one: Learning to Fly. We started with a relatively simple approach, step by step. The first thing we did was create recipes; we call the conanfile.py a recipe, and it contains the instructions, the methods, for how to build the package from sources: the source() method, the build() method, the package() method. And of course, the next step after defining how something is built from sources is to define the dependencies, because we are building a dependency and package manager.
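A minimal recipe along the lines described above might look like this sketch (not runnable on its own: it needs the conan package installed, and the package name and method bodies are placeholders, not from the talk):

```python
from conan import ConanFile
from conan.tools.cmake import CMake

class EngineConan(ConanFile):
    name = "engine"
    version = "1.0"
    settings = "os", "compiler", "build_type", "arch"

    def source(self):
        # Fetch the sources here (elided in this sketch)
        ...

    def build(self):
        # Build the package from sources with CMake
        cmake = CMake(self)
        cmake.configure()
        cmake.build()

    def package(self):
        # Copy the built artifacts into the package folder
        cmake = CMake(self)
        cmake.install()
```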

    So we introduced the requires attribute in recipes. In this case we have this game application, which is simulating a small video game: to build the game application we depend on an engine package, and we express that with requires = "engine/1.0" in the conanfile of the game package.
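The dependency chain from the talk (game depends on engine, engine depends on math) can be sketched as recipe fragments like these (in practice each recipe lives in its own conanfile.py; this is just an illustration):

```python
from conan import ConanFile

class GameConan(ConanFile):
    name = "game"
    version = "1.0"
    requires = "engine/1.0"   # the game depends on the engine package

class EngineConan(ConanFile):
    name = "engine"
    version = "1.0"
    requires = "math/1.0"     # the engine depends on the math package
```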

    Likewise, the engine depends on the math library, the math package, and it expresses that with a requires to the math package. So when the developer of the game needs to work on the game, the first thing they will do is a conan install. They do conan install and they get their dependencies, the engine and its transitive dependencies, and the second thing they get is that Conan generates files for them, so their build system can locate those dependencies. In this case it will generate two files for CMake. I'm using CMake in all these examples, but in case you don't know, Conan integrates with any build system, including MSBuild, Autotools, Meson; we have native integration with all of them, and you can also use your own system. I'm using CMake for this talk because I assume you all know CMake. So in this case Conan will generate two CMake files that will help the game locate those two dependencies and link against them.

    But then we started to learn from the ecosystem. We thought that static linkage was good enough for most cases. No, it is not. There are tons of users out there, approximately half of the users in companies, I think, using shared libraries in some place or other, and they were proposing cases like this one: hey, we have a game, but we have two releases; one of them is using the engine as a shared library and the other one is using the engine as a static library. In both cases we can assume that math is going to be a static library. But in both cases, Conan, you are giving me the math file, you are giving me the math target, always. I don't need to link with the math target always, because when I'm using the game with the engine as a shared library, the math library is an implementation detail of the engine; I don't have direct visibility of it, so I don't want to link with the math library. So please don't generate this file for us. Basically, at that stage the ecosystem was asking us to Learn to Fly. We couldn't do that; we didn't know how to do that, because we didn't have a model to represent it. We had a very naive requires = "something" definition.

    So we decided to come up with something new, and this is the new proposal for Conan 2.0. As you can see, it's identical to the Conan 1 one, but the trick of course is that we enable some optional arguments, traits, that you can provide to the requires. In this case we are expressing that the engine package requires both the headers and the libraries from the math package. When the engine developers are working on the engine, that translates into the math config CMake file being generated with information both for the headers and for the libraries. But now the developer can start expressing what they want: hey, I know that for some reason the engine doesn't use the headers at all, it has some implicit definition of the interfaces, so I know that the engine will not use the headers. Then they can set the headers trait to false, and that means the generated file will have the include-directories properties removed from the config files. In a similar way, they can say: hey, I don't depend on the libraries; for some reason I'm only using a header file from the math package, so please don't give me your libraries, because I won't use them, I don't want to link with them. So they can set the libs trait to false, and that means the generated file will have the link-libraries properties of the targets removed.
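As a recipe fragment, the traits described above would be declared in the requirements() method; a minimal sketch (trait values are just the talk's examples, and this needs the conan package to run):

```python
from conan import ConanFile

class EngineConan(ConanFile):
    name = "engine"
    version = "1.0"

    def requirements(self):
        # Default: the engine uses both the headers and the libraries of math
        self.requires("math/1.0", headers=True, libs=True)
        # Variants discussed in the talk:
        #   headers=False -> include directories removed from generated files
        #   libs=False    -> link libraries removed from generated targets
```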

    The interesting part of this approach is that it can propagate: we can define rules by which these traits propagate from the upstream dependencies to the downstream consumers. In this case we have the game; the game depends on the engine, and the engine depends on math, and that means that somehow the game depends on math too, indirectly. We call that a transitive requirement; it's the orange box, and it is similar to the game having a requires to the math package. The interesting part here is how these traits propagate from the upstream to the downstream. Let's say that we know the engine is always a shared library. The developer of the engine package can now explicitly say: I don't want to propagate the library requirements to my consumers. That means transitive_libs = False.
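Sketched as a recipe fragment, that declaration might look like this (a sketch assuming Conan 2's transitive_libs trait spelling; needs the conan package to run):

```python
from conan import ConanFile

class EngineConan(ConanFile):
    name = "engine"
    version = "1.0"
    package_type = "shared-library"

    def requirements(self):
        # The engine is a shared library: do not propagate math's linkage
        # requirements down to the engine's consumers (e.g. the game)
        self.requires("math/1.0", transitive_libs=False)
```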

    That propagates to the downstream consumer, to the game, and then the traits that the game gets towards the math package are headers=False and libs=False. That means the game doesn't actually depend on math when the engine is a shared library. Knowing those traits, Conan can define for the game developer a math config CMake file that is basically empty; in practice Conan can remove it. The idea is that the game will no longer get linkage requirements to the math static library, because there is a shared library in the middle.

    This might seem a bit cumbersome, because now it seems that the developers of the Conan recipes should be specifying the traits all the time, and that can be a bit too much. The thing is, most of the time that is not necessary at all, because these traits can be deduced from the package types. We already have that information in some recipes, because we have the options: the standard option for being a shared or a static library is called the shared option, so if we have this option, based on its value we can deduce if we are a static or a shared library. And for those cases where we don't have this information, Conan 2.0 introduces the new package_type attribute, which can explicitly say if something is a static library, a shared library, a header-only library or an application, for example.
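For instance, a recipe can declare its type explicitly so Conan can deduce the default traits for its consumers (a minimal sketch; the set of values mirrors the ones mentioned in the talk):

```python
from conan import ConanFile

class MathConan(ConanFile):
    name = "math"
    version = "1.0"
    # Lets Conan deduce the default traits for consumers.
    # Other values mentioned in the talk: "shared-library",
    # "header-library", "application"
    package_type = "static-library"
```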

    Let's see it in practice. As I'm doing a talk about our tool, I always like doing live demos; I know it's a bit risky, so fingers crossed. I have built here exactly the same example that I'm showing in the slides: I have the game package that depends on the engine, which depends on math. So now I am a game developer and I'm going to work on the game, on the executable. The first thing I need to do is install my dependencies, and I'm going to work on the executable that uses the engine as a shared library, so I specify that with -o engine/1.0:shared=True. Conan will install my dependencies for me; now I have my project, I could open CMake, create my project and start building. But what I wanted to show you are two things. First, math: here in the dependencies we can see that Conan is telling us the math binary can be skipped. Why? Because there is already an engine package in the cache, and as it is a shared library and it's already compiled, you don't need the math binaries at all. So Conan knows that this binary can be skipped; it doesn't need to be downloaded from the server at all.

    The second interesting part: if we check the files that Conan generated for us, we see that we have some CMake files here, but we will not have math config files at all, because we don't need them. If we inspect the data in the engine CMake files, we see that the dependencies are an empty set, because from the perspective of the game developer, math doesn't exist; it's an implementation detail of the engine shared library. But if we now do a conan install with the defaults, which by default means static libraries, Conan will tell me: hey, you need the engine package in the cache, and you need the math package in the cache as well, because you are going to link with both static libraries.

    And if we check the generated files that Conan generates for us, we will see that they now represent the transitive dependency between engine and math; it is here. And if we inspect the math files, which have now been generated because now I need to link with the math static library, we will see that we have the library defined here, we have the path to the library here, and, interestingly, the include directories of the math package are empty. Why? Because if you want to use the headers of math in the game, you need to put an explicit requires from game to math. If you don't put that, those are transitive headers, and by default transitive headers should be hidden when we are talking about packages; that's software architecture at scale.

    So with this we managed to have a much better representation of the graph and the requires in Conan 2.0, with correct linkage requirements and correct header visibility. We also allow things we know are not great but that many users out there have, like different versions of the same package hidden inside shared libraries and used in the same product at the same time; you can find that more often than you would like. And many other things: I've been talking today about three traits, but there are actually around eight, so if you want to know more about the new traits I would recommend the talk I did at ACCU in April, because it is dedicated only to this part. The cool thing is that this is very similar to what CMake or other systems are doing, propagating the linkage requirements and the header requirements, but we are doing it across different build systems: we can have a dependency graph with five different build systems, and their requirements will still be correctly propagated. Everything works behind the scenes across different build systems. And of course we need a compatible syntax, because we need to be able to move from Conan 1 to Conan 2.

    Second lesson learned: when we are talking about developers, we can assume that cats are beautiful, right? Or maybe not; in our case that's not what I'm talking about. I'm talking about the binary model, and a big mistake that we made. When we're talking about binaries, this is what we do: we have a recipe, and for every different configuration that we build we can create any number of binaries. Every binary is identified by a unique package ID; this package ID is a hash, the hash of the configuration, something like this. That means that if we change something in this configuration, say we change a library from static to shared, we get a different configuration and a different package ID, because the binary is different. Or if we change the architecture, the same: different configuration, different package ID, because it's a different binary. This is a simplified view; the full view also takes the dependencies into account, because if you are linking with dependencies, your dependencies also affect your binaries. Having exactly the same compiler settings, architecture, options and everything, if you change your dependencies and use different versions of them, your final binaries are going to be different.

    How different? That's the question we tried to answer with semver. We said: OK, we are talking about versions, right? Semver can't be wrong, it's the default of the industry. So we decided to use semver, and our interpretation of semver meant exactly this. We have these two dependency graphs: in one case we are using math 1.0 and in the other case we are using math 1.3. That can happen because both satisfy the version range in the engine, a range between 1 and 2, so both are perfectly valid requirements for the engine. But from a package-ID perspective, from a binary perspective, we understood the semver specification as: breaking the major is what a new binary means. So we said both things are semver-compatible, because changing the minor is supposed to be compatible, it doesn't break, and we assumed that was the right approach. That means in both cases the engine binary will be exactly the same; it should be exactly the same.

    And if we wanted to have two different binaries of the engine package for the two different versions, we would need to do a major bump of the math package. In that case, with math 1.0 and math 2.0, those would actually create a new expression that we feed into the configuration, and that results in a different package ID. But if we bumped only the minor, the equivalent expression of both 1.0 and 1.3 would be math/1.Y.Z in both cases, which results in the same configuration, which results in the same package ID.
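That masking can be sketched as a small function that reduces a dependency version to the representation that enters the package-ID hash. This is a toy sketch of the idea, not Conan's implementation; the mode names here are just illustrative labels:

```python
def version_repr(version: str, mode: str) -> str:
    """Reduce a dependency version to what enters the package-ID hash."""
    major, minor, patch = version.split(".")
    if mode == "major_mode":   # only a major bump changes the binary
        return f"{major}.Y.Z"
    if mode == "minor_mode":   # a minor bump changes the binary, a patch doesn't
        return f"{major}.{minor}.Z"
    return version             # full version: any change means a new binary

# Under the old semver interpretation, 1.0.x and 1.3.x collapse
# to the same ID input:
print(version_repr("1.0.0", "major_mode"))  # 1.Y.Z
print(version_repr("1.3.0", "major_mode"))  # 1.Y.Z
```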

    So the question is: when does a C++ project bump the major version? For example, Boost has never, I mean in the last years, bumped the major version. The thing is that in the C++ ecosystem we understand semver a bit differently. For us, the first digit is kind of a generational thing, very challenging, and when it changes it is completely destroying everything; breaking changes are typically being done at the minor. Boost releases are breaking, every release is breaking, and they only bump the minor version. But the most important thing here is the native compilation model. When we are linking a shared library with a static library, this is more or less what happens: we have here the native code, for Windows, and when we build the engine DLL, the shared library, it basically gets a copy of what is in the math static library; it embeds the native code of the math static library into the engine shared library. Of course optimizations can happen, and I think this is a debug build, but the main idea is that the code is actually embedded, making a copy of it in the engine artifacts. This is something that doesn't happen in other package managers: if we're talking about Python or Maven, the binary or package model they have is not native, so semver works for them, but in our case it's a bit different. And that means we had to go with a new package-ID model that handles two main use cases.

    The first use case is the non-embed one, which happens for example when an application is using a shared library. In this case the shared library is not embedded in the executable; likewise if a shared library uses another shared library, or if a static library uses another static library. In all of these cases the embedding of the native code doesn't happen. For this case we selected in Conan 2.0 the minor mode, meaning that the developers now have control: if you bump the minor version, you get a different binary, but if you only bump the patch version, which is a patch implementation detail, the binary ID, the package ID, will not change; that means you don't need to rebuild. We are going to see this in practice now. On the other hand, we have the embed case, which happens when an application executable links a static library, when an application links a header-only library, or when a shared library links either a static library or a header-only library. In all of these cases the embedding of the native code is happening, this inlining is happening. For those cases, no matter what changed in your dependency, you need to rebuild, you need a new binary of the consumer. So what we did is take the full reference of the dependency, which includes the full version, the revision (an implicit version that we manage), and also the package ID itself, because if the binary changes somehow, you still need to recompile, to rebuild, the downstream consumer. Of course you can configure all of this now, but the most important part is this dual model. Let's see it in practice.

    When we are developing this game, sometimes the math team will release a new version. So let's see what happens: conan create math, and they are going to release a patch version. In this case they fixed a minor bug in some of the cpp files, implementation details, so they fix the bug and create a patch version, 1.0.1. The key question here is: how does this affect my binaries? Let me put it otherwise: what do I need to rebuild? Now I could try to build the game, but I'm going to show you a new Conan command, conan graph build-order, for the game, which tells me which binaries are missing. Conan will analyze the dependency graph; it knows that math now has a new version that fits in the range, and, as this is an implementation detail, the engine static library doesn't need to be rebuilt. You can reuse the previous binary that you had, because we are assuming that if the change was done in the math cpp files, you don't need to rebuild the engine library. But in this case we need to rebuild the game: the game executable should be rebuilt because it needs to re-link with the changes that happened in the math library. And likewise, if we now ask for the build order, which binaries are missing for the game if I specify the engine as a shared library, it will tell me: OK, the math you have it here, it was created; now the engine shared library must be built, because there was a change upstream that affects you, you got a new version, you need to do a rebuild. But once you rebuild the engine shared library, the game application doesn't need to be rebuilt, because it links a shared library; the shared library will embed the changes of math, and it will not require a re-link of the game executable.

    As you can imagine, this is simple at this scale, but we learned that our users have big dependency graphs; a typical thing is 100 or 200 packages, for example, and now we have users approaching the 1,000-package mark. So for them it's very important to know, when something changes in the middle of the graph, what needs to be rebuilt downstream of that dependency graph, in an optimal way. Of course this can look a bit weird there on the command line, but it is intended for automation: if we check the JSON output of this command, we see that it returns a list of lists of things that need to be rebuilt, so I can take this output and parallelize all the things that need to be rebuilt, at scale, because Conan will tell me what needs to be rebuilt, in which order, and how you can parallelize it.

    OK, let's clean up a little bit. So finally: it is not a cat, it is not a dog. The model that we need for binary representation in C and C++ native building is a bit more complicated; it's something hybrid. And within that hybrid we need to differentiate the embed mode, where if you change anything in your dependency you need to rebuild the downstream consumers of that dependency, from the non-embed case, where we have full control: we have decided on a policy that allows developers to explicitly say whether the consumers need to be rebuilt or not depending on the changes, and of course users can customize this behavior as well. This will be a major enabler for scale. So these things were related to the graph and how the binaries are computed.

    Another important lesson we learned is that there are more people than developers. We had a very developer-centric view; we were concerned about just the packages and getting my application running. But we started to learn that many of these companies also have full DevOps teams that need to take the things that you build, which are inside the Conan package, and publish them as Debian packages to distribute to a server, for example, or create a zip for their users, their customers, because they are not going to distribute Conan packages. We also learned that the ecosystem is extremely opinionated, as you probably know, more than other languages, and this is a typical conversation. We had users that wanted their binaries not in the Conan cache. When you install something, Conan puts it in a place with a structure that it can manage very well, but developers were asking: hey, I want those binaries in my project. We were trying to argue: there is no technical advantage to that, actually there are only disadvantages. And the conversation finished with: OK, I want the binaries in my project. The problem is basically the same: someone wants the binaries, the artifacts that are inside Conan packages, and they want them out; they want them in a user folder, they want to automate something with those artifacts outside Conan packages.

    So for them we developed in 2.0 the deployers concept. A deployer is an external script, a Python script, that implements exactly this logic: how to extract things from Conan packages and do something with them. There are actually several different levels: you can have these deployers somewhere in your folders, you can install them automatically for all your developers and CI machines with a single conan config install command, and finally we also have some built-in deployers.

    Let's have a look at this. I'm going to show you first the full_deploy built-in, which is the one that implements the case of the opinionated C++ developer that wants the dependencies in the project. If I'm the developer of the game, I can now specify that I want to use this deployer. Apparently everything is the same, but if we check our project now, we will see that we have a copy of the binaries: for the engine we have the headers and we have the libraries here, and for math we also have the headers and the libraries, and everything is a copy inside my project. Furthermore, something that developers really wanted: if we check the CMake files that were generated, those CMake files are now pointing not to the Conan cache but to the deployed copy in my current project. That means I just achieved a folder here that contains absolutely everything; I could remove Conan from my system, and with just the same files that I have here in my folder, the local copy of those artifacts, I could build my application with the dependencies.

    Now let's go to the case of the DevOps engineer that needs to create something, say a bundle for their customers. All they need to do is write their own deployer, and the deployer looks like this. In this case the deployer is basically iterating over all the dependencies; for every dependency it looks into the binary folder, and if it finds a DLL or an executable there, it copies that artifact into the current folder and stores it in a list. When it has finished copying all the executables and all the DLLs, it zips them into a zip file and removes the copied files. That means that if the DevOps engineer wants to get the final product for their customers, all they have to do is conan install: I want to deploy this version of the game, explicitly the one that is using the engine as a shared library, and I'm going to deploy it with my custom deployer here. Conan does its thing, and we can check that now we have this runtime zip here, and inside the runtime zip we will see that we have the game executable and the DLL that can be shipped to the customer.

    So with the deployers we learned that there are other people besides developers in companies, and they also want to automate some tasks. We created these deployers as a flexible way to extract artifacts from the cache and automate some post-processing, like creating Debian packages or other kinds of installers. This is important because in Conan 1 this functionality was embedded in recipes; something equivalent was called the imports and deploy methods in recipes. But that doesn't scale, because typically you want to have the deploy functionality and apply it to many different packages out there, so we extracted the deploy functionality into its own dedicated file. And finally you can manage these deployers with a single command: you can put them in a repo, do conan config install of the repo, and it will install the deployers of your company or your team into your current installation, and you will be able to manage them.

Okay, so then we learned the fourth lesson: the importance of repeating yourself. Why? Because we learned that companies need to reproduce the exact same build today and in ten years. For example, the medical sector is absolutely insane about this; they even need to keep copies of the hardware so they can reproduce exactly the same thing. And this also applies to dependencies, because dependencies change: not only do new versions appear, but sometimes someone fixes a dependency without bumping the version, which we call a revision. So sometimes things will change, but there is still this need to repeat things, and this is the problem that we want to solve here.

Basically, the developer of the game at some point did a conan install, got engine 1.0 and math 1.0, and is developing; everybody is happy, everything works perfectly. Then it happens that the math team decides to release a new 1.1 version. The day after, the game developer comes to their machine, does the conan install, and because the math package satisfies the version range, the dependencies will now be different. If there is a new bug in math, or something that changes behavior, they will say: wait, what is happening here? I didn't change anything, I didn't change my requirements, I still require engine 1.0, but I'm seeing a different behavior. How can I get rid of this? This is the problem that lockfiles solve. In this case the game developer can say: hey, I installed my dependencies, but I captured a lockfile that will look something like this. A lockfile is basically something that captures the versions of the dependencies. So no matter whether the math team decides to release a new version, the day after, when the game developer gets to the office and does a conan install, they provide the lockfile as an input and they will get exactly the same dependency graph.
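The capture-then-replay idea can be sketched in a few lines of plain Python. This is only the concept; the function names and the flat `name/version` list are simplifications of mine, not the real Conan lockfile format.

```python
def capture_lock(resolved):
    """Capture a resolved graph as a flat, sorted list of pinned refs."""
    return sorted(f"{name}/{version}" for name, version in resolved.items())

def apply_lock(requirement_name, lock):
    """Return the pinned version for a requirement, ignoring any release
    published after the lock was captured."""
    for ref in lock:
        name, version = ref.split("/", 1)
        if name == requirement_name:
            return version
    return None  # not locked: resolve normally
```

Even after a hypothetical math 1.1 is published, `apply_lock("math", lock)` keeps returning "1.0" for as long as the captured lock is provided as input.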

Okay, this is a concept that exists in other package managers; you have it in npm, for example, and Conan 1 implemented lockfiles too. The only thing is that they were very challenging, because of the dependency graph; I will talk about that in a moment. Let me show you a little bit about the importance of lockfiles: in the last two and a half years, around 10% of the issues in GitHub are related to lockfiles or mention lockfiles somehow. Because at some point, when you are managing dependencies, you have to make a decision. You either explicitly pin all the versions and revisions in your requires, which means that when a new version of one of your dependencies is released and you want to start using it, you need to go to your projects, modify them, and explicitly change your requires to point to the new released version. You have full control, but this can also be cumbersome and slow. So there are developers who want to go faster, and the way to go faster is to use version ranges: say, I accept any version in this range. But when you start to use version ranges you start to move faster, and when you move faster things can break. You either need to be ready to move forward very aggressively, meaning that when something breaks you quickly go and fix it, moving forward all the time, or, if you need some balance, you need to capture lockfiles to be able to reproduce something, because otherwise debugging or maintaining things is a nightmare. We estimate that around 25 to 40 percent of users are using lockfiles nowadays, and the actual demand for lockfiles is huge; we have seen it in other ecosystems like npm. Maintaining your lockfiles is necessary in many cases.
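The trade-off between pinning and ranges comes down to how a range is resolved. A minimal sketch, assuming a simplified "at least lower, below upper" range and plain dotted numeric versions (real Conan version ranges are richer than this):

```python
def parse_version(v):
    """Turn '1.10.2' into (1, 10, 2) so comparison is numeric, not lexical."""
    return tuple(int(part) for part in v.split("."))

def resolve_range(available, lower, upper):
    """Pick the highest available version with lower <= v < upper."""
    lo, hi = parse_version(lower), parse_version(upper)
    matching = [v for v in available if lo <= parse_version(v) < hi]
    return max(matching, key=parse_version) if matching else None
```

The moment a new 1.1 appears on the server, the same range silently resolves to something different, which is exactly why a lockfile is needed for reproducibility.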

So the other thing we learned: this is the simple case of lockfiles, and Conan 1 managed to implement it. The main problem is that you had to capture one lockfile per configuration, and that was a major challenge. Why? It happened that the dependency graph resolution in Conan 1 was not fully deterministic, so we needed to capture the full dependency graph in the lockfile, and that dependency graph can change across configurations because there are conditional requirements out there; then you need one lockfile per different binary. But this was still doable in Conan 1. What I am going to show here was almost impossible. It is the case where the math team releases math 1.1, but we don't want it: we have locked to math 1.0, we are not going to use that crazy release from the math team, they don't know what they are doing, it's full of bugs. But we need to release a new version of the game: there was something major in the engine, and now the engine team, who are my friends, are releasing engine 1.1, and I want to test it. I want to release the game again using engine 1.1, but it should still be using the math 1.0 version, because I don't want to pull the new one. How can I achieve this, if the lockfile is telling me: you are locking math to 1.0 and you are locking engine to 1.0? I want to change engine but I don't want to change math. How do you do that? It's very complicated. So this is the problem that we learned from the ecosystem: that the

enterprise always faces the challenge of scale. Why? Because, as I told you, typical dependency graphs are around 200 packages, but the thing is that they also have 200 developers, and those developers are doing pull requests to different packages at the same time. The only way to keep sanity is that every one of those pull requests needs to be tested in isolation from the other changes, from the other new versions that the other teams are releasing. Otherwise you can build a pull request and, in the middle of the pull request, a new version comes up: the first build, for Linux, is using math 1.0 and the second build, on Windows, is pulling math 1.1, if you are not actually locking to math 1.0 and building in isolation. So this continuous integration at scale is something critical for the enterprise.

I like to think that, in the same way that programming over time is software engineering, Conan is going beyond package management. We learned that the users were asking for this, so we developed this functionality for them. We understand that package and dependency management over time, what happens when a dependency changes over time, is more or less what the DevOps buzzword is about, and Conan 2.0 is incorporating tools to serve the DevOps community, those who are starting to do DevOps for C++ with Conan. So, as I told you, the major contribution of lockfiles in Conan 2.0 is that, instead of capturing the full dependency graph in a lockfile, they manage to put the locked dependencies, with their versions and revisions, in just a simple list. This is a game changer: it simplifies a lot of things, and it enables exactly what we want. Let's try to solve it.

Okay, let's go again to the game. I'm the game developer, in the use case that I told you about. I can do a conan install and capture my lockfile, and I am going to call it game.lock. This is for demo purposes: if we name it conan.lock it is going to be used automatically, but I am going to pass the arguments explicitly in the commands so we see everything we are doing. So this will be captured in the lockfile, and we can actually have a look and see that it is this simple list, something that users can read and understand. Now, if it happens that the crazy math team decides to release, with a conan create of math, the 1.1 version, and we don't want to use it, it will be relatively simple. Now the game developer can say: hey, I want to use the lockfile as an input. In the previous command it was the output, I was creating the lockfile; now, as an input, I can provide game.lock, and that will guarantee that I am still using math 1.0.

Okay, now what I am going to try to show you, I know, is a bit complicated, but let me tell you: we had prepared a training in Conan 1, a three-hour training, for trying to solve what I am going to solve now in about one minute, which is the problem I just described. We are now going to be the engine team, so I am moving to the engine. Now I am going to do a conan create: I am making some great improvements to the engine, creating the new version 1.1. And since of course we don't want to pull the crazy changes from math 1.1, I am going to provide the game.lock lockfile that I created. And I am going to create a new lockfile, because I am making a change: the previous lockfile contained engine 1.0 and math 1.0, but I am making a change, so I want a resulting lockfile, and I am going to pass a lockfile out argument. So first, this guarantees that we are using math 1.0; it creates the engine changes; and if we inspect the new lockfile, we will now, interestingly, see that we have two versions of engine, 1.1 and 1.0. No problem with this. Now we go back to the game developer, and we want to test the thing that the engine team did, so we do a conan create and we provide the lockfile that the engine team produced. This will guarantee, there, it finished, this will guarantee that we are using engine 1.1 but we are still locked to math 1.0. And if we check the output of this execution, we see that effectively we are pulling the changes from engine 1.1, but the math target is still math 1.0. Sorry, is the time correct? Oh, so that clock is broken. Okay.
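The behaviour shown in the demo, a lockfile that ends up listing both engine/1.1 and engine/1.0 and where resolution picks the newest locked version for each package, can be sketched like this (an illustration of the idea, not Conan's implementation):

```python
def parse_version(v):
    """Numeric comparison of dotted versions."""
    return tuple(int(part) for part in v.split("."))

def resolve_locked(name, lock):
    """Among the locked refs for `name`, return the newest version.
    The game picks engine/1.1 while math stays pinned at 1.0."""
    versions = [ref.split("/", 1)[1] for ref in lock
                if ref.split("/", 1)[0] == name]
    return max(versions, key=parse_version) if versions else None
```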

Okay, thank you. So lockfiles: lockfiles in 2.0 are a game changer for continuous integration at scale. They enable it in an easy way, in a way that developers can understand, because those lists of locked requirements can be manipulated: you can merge them, you can merge lockfiles from different products if you want to, and you can manually add versions there, just to force the usage of a specific version in your dependency graph. Okay, the last lesson.

You are an open source maintainer, and you start to get a lot of feature requests. Feature requests like: could Conan, after conan install or conan create, remove the temporary build folders automatically, please? Can we add a configuration that removes the temporary build folders for me? Or: I want conan install to install the Debug and Release configurations in just one command, so please add syntax to the command line so I can do it in one line instead of two conan installs with different arguments. Or: can we do a conan create and then, with a configuration or a command-line argument, do the upload to the server right after the conan create? And then you realize that if you start doing those features the complexity goes through the roof; the command line gets really cluttered. It is certainly something that you cannot do, it's impossible, because it would pollute your CLI badly. But what we learned from the community is that approximately 50% of the big teams using Conan are creating their own layers of automation, mostly in Python, on top of Conan. So it seems that they didn't want just the tool; of course they want Conan, but they needed, they wanted, a framework they could develop on top of. And this is exactly what we did: we created a framework based on user custom commands. Basically, those Python scripts that you are already using can be transformed into custom commands with a specific syntax. Those commands can be installed in the Conan clients of your developers and of your CI with just the same conan config install command that we use for profiles, settings, deployers and

everything else. The most important thing about the user commands is that they can now be built on top of a public, documented API. In Conan 1 we had an API which was basically a parallel of the command-line commands: they mapped one-to-one to an API. In Conan 2 we are creating a full-blown Python API with real detail. For example, for the upload API you have all the detailed steps that happen during an upload: how you ship the artifacts, how you check what needs to be uploaded because it already exists on the server or not. You have full control over all those steps, in the upload, in the installs and everything, so you can actually build on top of that instead of just the high-level command-line interface.

So this is how the commands look. This is a very simple command, hello. We mark them with a decorator here, and you can have commands, and nested commands if you want, which we call subcommands. It implements formatters, so if you want different output formats for your commands you can implement them as well. You have access to nice colored output too, and of course to the API that we will see. And the cool thing is that you can just install these commands with conan config install. Here you could use a URL, like a git repo URL for example, and it will fetch those things from git and put them in your installation; in this case, for the demo, I am using just a local folder, and this will copy those commands that I have in my folder and put them into Conan. That means conan help will show these commands: I see that I have this nice hello command here. Of course I can type conan jfrog hello and it has nice command-line help; I can execute the command, which prints the hello world, or I can use the command plus the output format, which will be JSON.
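The decorator-plus-formatters pattern behind these commands can be sketched in plain Python. This is a toy registry to show the shape of the idea; the real Conan 2 decorator and its signature live in the conan package and differ from this.

```python
import json

COMMANDS = {}

def command(formatters=None):
    """Register a function as a CLI command with optional output formatters."""
    def decorator(func):
        COMMANDS[func.__name__] = {"run": func, "formatters": formatters or {}}
        return func
    return decorator

@command(formatters={"json": json.dumps})
def hello():
    """Toy command that returns structured data instead of printing directly."""
    return {"greeting": "Hello world!"}

def run(name, fmt=None):
    """Dispatch a registered command, optionally through a formatter."""
    entry = COMMANDS[name]
    data = entry["run"]()
    return entry["formatters"][fmt](data) if fmt else data["greeting"]
```

Returning data and formatting it separately is what makes the same command usable both interactively and with a machine-readable output format.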

Let's implement a case users have asked for: they have a repo with a bunch of recipes, conan-center-index style, and they want to export all the recipes one after the other, so they asked for the conan export command to do that automatically. We didn't do that, but now you can do it yourself in just these few lines. You take the command-line argument with the path of the repo you want to export, you check all the folders inside, and if you find a conanfile.py inside a folder you call the Conan export API, and finally you capture the result of that API call in the final result. That is all you have to do; this implements the conan jfrog multi-export command. Sorry, I didn't show you: I have here a repo with a couple of packages, ai and physics, so I can point to the repo and it will iterate over the repo, exporting each one, and it captures the result of the command. If I want to see it, I can use --format=json: it will do the export of every package there and return a JSON list of all the recipe revisions that have been exported, and I can use this output to automate further in my flow.
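The scan-and-export loop is mostly directory walking; here is a stdlib sketch in which the `export` callable stands in for the call to the Conan export API and is just a placeholder.

```python
from pathlib import Path

def multi_export(repo_path, export):
    """Walk the repo's subfolders; wherever a conanfile.py is found,
    call the given export function and collect its result."""
    results = []
    for folder in sorted(Path(repo_path).iterdir()):
        recipe = folder / "conanfile.py"
        if folder.is_dir() and recipe.exists():
            results.append(export(recipe))
    return results
```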

Okay, so these custom commands: we are very excited about them too, because they are a very powerful extension that is convenient for both developers and CI. They build on top of a Python API with real building blocks, and they are super convenient to share, install and manage for your developers. They can even be versioned, because you can put them in a git repo, in a certain branch or a tagged version out there.

Okay, so my conclusions. This was our Conan 1.0 release, a bit barbaric, and in these five years we have learned a lot of things. These are the major learnings. First, the graph model was not enough to represent all the special things that we need in C and C++ builds; we needed to come up with a better graph representation, with requirement traits and package types, to represent how things should behave in a dependency graph of C and C++ packages. We also had to improve the binary model, because semver was clearly not a good default, and we came up with a new model that takes into account when embedding and inlining into native binaries happen and when they don't. We also handled the request from the DevOps people who needed to take the artifacts out of Conan and deploy them in other ways, in other places, with the deployers. We also solved the major pain of lockfiles and continuous integration at scale: we removed the complexity of the Conan 1 lockfiles with a much better approach that now allows testing every change in isolation, in a huge dependency graph, with tons of developers making changes at the same time. And finally, we created a framework for maintaining your own automation: your own custom commands, built on the Python API.

This is not everything; these are just the five biggest lessons we learned. Actually, as I told you, Conan 2.0 is quite big, so there are many other important things that I haven't talked about, like a cache that can handle multiple revisions, or package signing with a plugin, where we are collaborating with Sigstore to have package signing too. And other things: I am very excited about the custom binary compatibility. We added a plugin mechanism with which you can explicitly define the binary compatibility you want, for example between compiler versions. If you know that your packages have binary compatibility between GCC minor versions, 4.1, 4.2 or whatever, you can encode that in a plugin that you own, that you maintain, that you install with conan config install, and that
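The kind of rule such a plugin encodes can be sketched as a function from one configuration to the list of configurations whose binaries are acceptable substitutes. This is a simplified sketch, not the actual plugin interface; the GCC versions are just the example from the talk.

```python
def compatible_configs(settings, compatible_gcc=("4.1", "4.2", "4.3")):
    """If built with one of the listed GCC versions, treat binaries built
    with the other listed versions as acceptable fallbacks."""
    if settings.get("compiler") != "gcc":
        return []
    current = settings.get("compiler.version")
    if current not in compatible_gcc:
        return []
    return [{**settings, "compiler.version": v}
            for v in compatible_gcc if v != current]
```

With a rule like this, upgrading from GCC 4.2 to 4.3 does not force a rebuild of every package: the resolver can fall back to the existing 4.2 binaries.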

will allow you to be way more efficient in the number of binaries that you create, for example when you upgrade compilers, along with many other optimizations that are now in Conan 2.0. So, my conclusion: Conan 2.0 is going to be a game changer, and everything that I have shown today is already released. This is the beta; we will be doing two or three betas more, depending on how it goes, and then we will go GA. So if you are using Conan 1 today, we recommend you start trying the Conan 2 beta and give feedback, and if you are not using Conan today, you are probably missing out; you should be starting, and please start with 2.0, because this is the next thing. If you have any question, go to conan.io; you have links there to our GitHub, to our email and everything, and don't hesitate to contact us with any question regarding Conan. Thank you very much.

And if you have any questions, please, you have the mic there. [Applause]

Q: Thank you. Is it possible to propagate compile flags from my project to all the dependencies? Like, if I want to enable sanitizers, can I set that in my project and have it automatically propagated to the dependencies?

A: Well, propagation can happen in two different directions. You can have a package, for example, that defines cxxflags in what we call the cpp_info, and you can inject flags there that will get propagated downstream. Those flags follow the same rules that we have seen for the traits: if you put some cxxflags in a dependency, they will be injected into its consumers. In the case of a shared library, those cxxflags will be applied to build that shared library, but they will not propagate further down to the final application, to the game; they stop there. But for the case you are talking about, sanitizers: a sanitizer is something that, in most cases, the whole dependency graph needs to be built with, so it is more like a setting. You want to model it as a custom setting that you inject in your settings.yml, maybe as a subsetting of the compiler, and that will automatically reach all the recipes, because everything should be built with the sanitizer configuration enabled. So, more than a cxxflag you want to propagate, it is an input: the same way you specify the compiler version, you say, I want this compiler version and I want to use these sanitizer flags.

Q: So in the profile you can now specify a flag, without it needing to be a setting?

A: Yes, we also added, in the new configuration, tools.build:cxxflags in the configuration section of your profile, and that will be injected across the whole dependency graph.

Q: And is all of this going to be part of the hash of the package, like all the compile flags?

A: You have control over that. By default, configuration items like this cxxflags one are not put into your package ID, but there is another configuration with which you can say: hey, I want my cxxflags to be part of the package ID automatically. That is what I meant when I said you have full control over the package ID: you can tell it, I have this configuration that I am changing and I want it to be part of the package ID.

Q: Thank you. A: You're welcome.

Q: Thank you for the talk; these look like very exciting changes from Conan 1 to Conan 2. I have two questions, actually. One is about what you mentioned with the traits for the requirements, like "I want the headers or not", which controls what goes into the CMake package config file. How does this handle conflicts, if multiple libraries in my dependency tree depend on something and some of them want the headers and others don't, and so on?

A: That's a big question. If you noticed, the traits are of type bool. Instead of going with CMake's PUBLIC/INTERFACE/PRIVATE, we decided to go with bools because there is a fourth case besides those three: you also need to handle the case where you don't depend on something at all, which CMake's PUBLIC/INTERFACE/PRIVATE cannot express. So we made them booleans, because at the end of the day it is a mask. When you have diamonds in your dependency graph, not conflicts but diamonds, and the branches carry different traits, we have given the traits additive semantics. Basically, if one branch has a trait saying it needs the headers of a dependency and wants those headers propagated, then no matter what the other branch says, the final user will get the headers, because one branch said it wanted them. The same happens for the libraries. All the traits are built with these additive semantics, so we can handle the dependency graph propagation in that sense.

Q: Okay, thank you. And my other question comes from my prior experience working with Conan,

maybe one of a thousand little things. When running conan create, sometimes I found myself needing some sort of build-time configuration for the package, like "here is a path to something". If it is done as an option, it propagates into the package ID and so on. So is there a way to pass some, let's say, creation-time arguments?

A: Well, the mechanism for providing custom user input now is the new configuration. The configuration covers the built-in configurations that we have, but there are also user.* configurations that you can manage, and you can read those configurations in your recipes to do whatever you want. You basically have full control. We have gone from trying to be a smarter Conan 1 to letting the developer, the user, control everything; they have full control with the toggles and everything. In this case, in the configuration, we have enabled user-defined configuration.

Q: Thank you. A: Excellent.

Q: Thank you. Back to the header propagation and library propagation concept: when you specify, let's say, headers=False in your requirements, is that saying that the package we are building right now will use the headers, but we won't propagate them?

A: No. headers=False in a requires means that the current package does not use the headers of the dependency. I can give you an example: say the mythical protobuf package contains the headers, the libraries, and the protoc compiler. When you depend on protobuf in the build context, you put headers=False and libs=False and run=True, because you only want access to the runtime in the build context. So it means: hey, I don't want the headers of this package. The transitive_headers and transitive_libs traits are the ones that define how those things propagate to the downstream consumers.

Q: Okay, so there is a transitive_headers? A: Yeah, it was in one of the slides: there is a transitive_headers trait and there is a transitive_libs trait. Q: Okay, that sounds like exactly it. Well, thank you.
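The additive semantics discussed in these two answers, boolean traits that are OR-ed together when two branches of a diamond reach the same dependency, can be sketched as:

```python
def merge_traits(*branches):
    """Traits are booleans; when several branches of a diamond declare the
    same dependency, the merged trait is True if any branch needs it."""
    merged = {}
    for traits in branches:
        for name, value in traits.items():
            merged[name] = merged.get(name, False) or value
    return merged
```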

Thank you. Any other questions?

Q: So, about creating the new lockfile: I remember there are some recipes where the requirements are conditional and so on, so how do you capture the full set?

A: Yeah, exactly, that's a very good question about lockfiles. I said that now you only need one lockfile to capture all the possible configurations. If you have variations and you have conditional requirements, what you need to do is run the conan install for as many configurations as you want to lock, passing the lockfile both as input and as output, and it will keep adding the things that are conditional on some platforms. The cool thing is that, being a list, it works for all configurations.

Q: So it works? A: Yeah, you only need to run the different configurations one after the other, providing the lockfile both as input and as output. And time is over. Thank you very much,

    everyone [Applause]