A Simpler Way to Build Boost
Thursday December 21, 2017

Yesterday I wrote a post about using a lame Python script to rename the Boost binaries built by Bjam. Today's post is about how such a script is completely unnecessary; you can just tell Boost to name the files differently.

I've been using the option --build-type=complete to build the Boost libraries for years. It basically tells Boost Build to build all of the different variations possible for a library and a given compiler, which helped me in my previous life when I used nothing but Boost Build (so the tree was organized as “library/variants”). However, in the modern Cenv era, where I use a prefix path for each build variation which then contains libraries (so the tree is “variant/libraries”), there's no need to do this.

So instead, when calling b2, pass in --layout=system, which gets Boost to avoid all of its squirrelly naming conventions. This works just fine, since the resulting binary will be the only file for a given Boost library sitting in a cenv.

So: the final install process looks like this:

cd boost-directories
bootstrap.bat
cenv set win64-debug
b2.exe --clean-all
b2.exe --stagedir="%CGET_PREFIX%" --toolset=msvc-14.1 address-model=64 debug link=shared --layout=system stage -j8 --with-system

First off, win64-debug is a cenv I created with cenv init win64-debug -DCMAKE_GENERATOR_PLATFORM:none=x64 -DCMAKE_BUILD_TYPE:none=Debug. The toolchain arguments for cenv init go to CMake and tell it to build for 64-bit (ideally they would also force debug builds, but that doesn't work on MSVC++ for reasons I won't get into).

When I run b2.exe, this toolchain info has to be translated into bjam-ese, with CMAKE_GENERATOR_PLATFORM=x64 turning into address-model=64. So for any given cenv you'll probably need to Google the translation from CMake to Boost Build to make sure your builds agree.
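For reference, here's the rough CMake-to-Boost-Build mapping I ended up relying on (double-check the b2 spellings against the Boost Build docs for your version):

CMAKE_GENERATOR_PLATFORM=x64   ->  address-model=64
CMAKE_BUILD_TYPE=Debug         ->  debug (the b2 variant)
CMAKE_BUILD_TYPE=Release       ->  release
Visual Studio 2017's compiler  ->  toolset=msvc-14.1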

Finally, --with-system tells Boost to build only the “system” library. To see the possible libraries, use b2 --show-libraries. It's also possible to specify multiple --with args on the command line.
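For example, to build filesystem and date_time along with system in a single pass (any of the names printed by b2 --show-libraries will work):

b2.exe --stagedir="%CGET_PREFIX%" --toolset=msvc-14.1 address-model=64 debug link=shared --layout=system stage -j8 --with-system --with-filesystem --with-date_time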

So: voilà! Assuming you mucked with CMake to make it work with the most recent version of Boost, you should be good to go. This is about as simple as I've ever gotten the process of using the Boost libraries to be.

Note that the environment variable BOOST_ROOT still needs to be set; otherwise, we'd have to copy the Boost headers into our cenv. That's as simple as copying the boost directory into the include directory of the cenv, but since it takes up 114 MB, and I seem to be constantly running out of the paltry 256 GB of disk space I have on the solid state drives I'm using, I prefer not to.
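For example (pointing BOOST_ROOT at the directory you ran bootstrap.bat in; the path here is a placeholder):

set BOOST_ROOT=C:\path\to\boost-directories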

Note: cget also has some interesting built-in support for building and installing Boost, so you may want to look into using that. However, it requires copying an entire distribution of all the Boost libraries to a temporary directory before it even does anything (which involves an extra 524 MB!), which may be a deal breaker if you're as perpetually low on disk space as I am.



Fixing CMake 3.10.1 to work with Boost 1.66
Wednesday December 20, 2017

I went to use the latest version of Boost yesterday only to find I couldn't make it work. “Hmm,” I thought, “didn't I just sacrifice hours of my life to do this very thing and write a blog post about it so I could remind myself later?”

After digging through CMake's included FindBoost.cmake file, it turns out that Boost Build changed its behavior this release to put the architecture and address model into the library names it spits out. So the copious amount of code in FindBoost.cmake looks for a file named boost_coroutine-vc141-mt-gd-1_66 but doesn't find it, because in Boost 1.66 that file is named boost_coroutine-vc141-mt-gd-x64-1_66 (the x64 part is new; the docs confirm this).

Second problem: CMake's Find Boost module (FindBoost.cmake) is oddly insistent on *not* defining imported library targets for Boost if it doesn't know what version of Boost you're using. This is kind of a big deal, as it means you can never use a new version of Boost correctly until CMake's authors figure out what is needed for that version of Boost and update FindBoost.cmake themselves.
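To see what's at stake, here's a minimal sketch of the usage those imported targets enable (my_app and main.cpp are placeholder names):

find_package(Boost 1.66 REQUIRED COMPONENTS coroutine)

add_executable(my_app main.cpp)
# The Boost::* imported targets carry their include paths and link
# dependencies along with them:
target_link_libraries(my_app PRIVATE Boost::coroutine)

# Without imported targets you're back to wiring up variables by hand:
# target_include_directories(my_app PRIVATE ${Boost_INCLUDE_DIRS})
# target_link_libraries(my_app PRIVATE ${Boost_LIBRARIES})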

As a hideous hack, this problem can be worked around as follows:

  1. (UPDATE: there's a way easier way to deal with this, see here for info.) Using a Python script, copy all of the binary files (DLLs and stuff) to a cenv, renaming them so they no longer contain x64. Here's the script, which must be run from the directory containing the Boost binaries and requires setting the environment variable CGET_PREFIX (which is done automatically for me by my tool Cenv):
    import os
    import shutil

    # Cenv sets CGET_PREFIX to the active cenv's root directory.
    prefix = os.environ['CGET_PREFIX']

    # Copy each binary whose name contains the new "-x64" tag into the
    # cenv's lib directory, dropping the tag from the name.
    for filename in os.listdir('.'):
        if '-x64' in filename:
            new_filename = filename.replace('-x64', '')
            new_path = os.path.join(prefix, 'lib', new_filename)
            print('{} -> {}'.format(filename, new_path))
            shutil.copyfile(filename, new_path)
    

  2. Edit the FindBoost.cmake file (on my machine, it's located at C:\Program Files\CMake\share\cmake-3.10\Modules\FindBoost.cmake) and change the following bit of code:
    if(NOT Boost_VERSION VERSION_LESS 106600)
      message(WARNING "New Boost version may have incorrect or missing dependencies and imported targets")
      set(_Boost_IMPORTED_TARGETS FALSE)
    endif()
    

    to
    if(NOT Boost_VERSION VERSION_LESS 106600)
      message(WARNING "New Boost version may have incorrect or missing dependencies and imported targets")
      # set(_Boost_IMPORTED_TARGETS FALSE)
    endif()
    

I'll admit I'm having a hard time understanding why the authors of CMake were so persnickety about only creating the imported targets when they themselves had blessed a new Boost version; it would seem to encourage people using bleeding edge versions of Boost to avoid the imported targets in favor of the other variables the Find Boost module spits out, which doesn't seem to fit the spirit of modern CMake.

I'm sure they had their reasons, but I'd argue that for users it makes sense to simply alter our own copies of CMake so it behaves as expected, and avoid littering our own CMake scripts with workarounds that won't be necessary in a future release of CMake anyway.



Keeping Clean with Cenvs
Saturday December 9, 2017

When I work with other languages on large software projects, the workflow is typically:

  • Grab the source code, extract it, cd into that directory.
  • Run some standard build tool. Usually this tool is well known and completely accepted by the programming language's community (Maven, Tox, Cargo, etc.), at least compared to C++, where every few years I hit a wall and have an existential crisis in which I contemplate how I'm building software and spend ages learning another tool.
  • See it create a pristine directory to host all build artifacts.

I've been amazed how in C++ this last step is so different. Most tools, instead of creating a single directory with the output of the build process, pollute whatever directory they're currently in with zillions of object files and associated build artifacts. There's a historical reason for this: Make does things this way, Ninja was inspired by Make, and CMake wants to work with all these tools so it has to follow suit. But it's really gross, and coming from other programming language cultures it feels extremely unintuitive.

In the same vein, “standard” package installs are pretty gross: instead of polluting the current directory with artifacts, they pollute your entire machine by affecting any software you build afterwards.

Typically, a package install works by invoking an install target, such as running “sudo make install”. This copies libraries, header files, and other stuff to system directories such as /usr/lib, /usr/include, /usr/bin, etc.
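The classic dance looks like this (autotools-style; CMake projects do the same thing through their generated install target):

./configure
make
sudo make install   # copies into /usr/lib, /usr/include, /usr/bin, ...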

So if you're working on a project and want to pull in a dependency, like SDL2, you'd download SDL2, run sudo make install, and then be able to use SDL2 from your project without including SDL2's source inside of your own project or “vendoring” the dependency (a euphemism for shoving all the build artifacts into source control).

I've typically avoided installation processes like this because:

  • They require sudo.
  • Installing globally clearly doesn't scale if you plan to work on two projects which require different versions or build variants of the same dependency.
  • Because you install the package globally, afterwards it's easy to forget you had to cross this hurdle. If you're not constantly documenting things (which is very possible when you're spinning up a new project) you may not even remember that you had to do anything a year later when you're solving the mystery of “why does this not build correctly on the new guy's machine” (don't say CI will fix this; it's just as easy to bake this kind of dependency into a CI box or base image).
  • On Windows this procedure probably doesn't work at all, or uses some different standard the author of the library or build tool invented that installs things to unpredictable locations. Or worse, it uses the actual Windows standards.
  • Uninstalling the package is not possible because the installation process isn't very well tracked (unlike using a Windows MSI or a Debian package) so you have to guess what needs to be removed.

Thankfully, there's a way to avoid globally installing packages: prefix paths.

These are root directory paths that override the default “system” paths. So instead of files being copied to /usr/lib they go into ${prefix_path}/lib, /usr/include becomes ${prefix_path}/include, etc.

Since the idiom of C and C++ package installs is only an idiom and not enforced by a contract between build systems, the way you specify prefix paths differs between tools.

In CMake the standard is to set the variables CMAKE_INSTALL_PREFIX to tell it where to put packages, and CMAKE_PREFIX_PATH to tell it where to find them.
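Concretely, it looks something like this (the paths here are made up): first install the dependency into a prefix, then point your own project's build at that prefix:

# Install SDL2 into a prefix instead of /usr:
cmake -DCMAKE_INSTALL_PREFIX=$HOME/prefixes/demo ../SDL2-source
cmake --build . --target install

# Later, tell your own project's build where to search for packages:
cmake -DCMAKE_PREFIX_PATH=$HOME/prefixes/demo ../my-project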

It's helpful for me to imagine each directory that can be used as a prefix path as its own, semi-isolated environment for C and C++ dependencies. I call this a C-environment, or just cenv for short.

A cenv is isolated in that it can't be affected except by packages installed globally. Since cenvs don't affect each other, you can protect them from external influences by keeping your system clean and never installing packages globally.

Once you realize that a mechanism for cleanly installing libraries for C/C++ exists, it's easy to imagine how to achieve a nice workflow similar to other languages:

  • Create a new cenv.
  • Download, build and install whatever packages you want your project to depend on, such as the Boost headers or SDL2, to the cenv.
  • Build the project you're working on, using the cenv to pick up the packages installed earlier.

Unfortunately, installing packages is still a somewhat difficult process that entails checking out source code, generating build files in CMake, and installing it to your cenv.

Thankfully we can use a tool called cget to download and install CMake based projects.

In recent years there have been a series of package managers introduced for C++. What makes cget different is how simple it is; most of these tools have introduced their own ideas about what it means to install a package, while cget instead went along with the CMake standards, which themselves were based on common idioms already in use in Makefile-based projects.

The one area where cget breaks from the norm is that it doesn't install packages globally by default. Instead, it creates a brand new cenv for you in the current directory, in a directory named ./cget. It also creates a CMake toolchain file in this directory, which sets CMAKE_INSTALL_PREFIX and CMAKE_PREFIX_PATH to use the cenv.

(Note: cget calls this directory a “new prefix path”, but I think the name “cenv” represents it better.)
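I won't paste a generated file here, but conceptually the toolchain file boils down to something like this sketch (not the literal contents of cget.cmake):

# Point package lookups and installs at the cenv:
set(CMAKE_PREFIX_PATH "/path/to/your-project/cget" ${CMAKE_PREFIX_PATH})
set(CMAKE_INSTALL_PREFIX "/path/to/your-project/cget" CACHE PATH "install prefix")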

Using cget looks like this:

cd your-project-directory  # This contains a CMakeLists.txt which uses GLM
cget init  # creates a new cenv at `./cget` if none exists.
# Downloads the 0.9.8.5 release of glm from Github, creates a build directory
# somewhere inside of `./cget`, builds glm and installs it to locations in
# the new cenv such as `./cget/include`.
cget install g-truc/glm@0.9.8.5
mkdir build && cd build
# -DCMAKE_TOOLCHAIN_FILE tells it to use cget's cenv
cmake -DCMAKE_TOOLCHAIN_FILE=../cget/cget/cget.cmake -H../ -B./
cmake --build ./

The code above creates a cenv, installs the GLM library to it, then builds the CMake project in the directory by passing the toolchain file cget created for the cenv to CMake.

(If you're curious about how CMake itself consumes GLM, somewhere in the CMakeLists.txt file will be “find_package(glm)” which will look in the cenv for package info on GLM. This blog post is already pretty long so I'll be explaining how this works in another one, but essentially if CMake knows where to look it can find libraries and header files that are installed in the typical way.)
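As a teaser, the consuming side can be as short as this sketch (the exact names glm's installed config file exports vary by version, so treat GLM_INCLUDE_DIRS as illustrative):

cmake_minimum_required(VERSION 3.5)
project(demo)

# Found via the cenv because the toolchain file put it on CMAKE_PREFIX_PATH.
find_package(glm REQUIRED)

add_executable(demo main.cpp)
# glm is header-only, so all we need from the package is its headers.
target_include_directories(demo PRIVATE ${GLM_INCLUDE_DIRS})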

The string you pass to cget install is called a “package source”. This can be the name of a project on GitHub (for example, above we fetched the 0.9.8.5 tag of GLM), a file path on your local machine, a URL to a tar.gz file, or other more exotic types beyond the scope of this blog post.
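A few package source examples (the first is the one used above; the others use placeholder names):

cget install g-truc/glm@0.9.8.5                  # GitHub project at a tag
cget install ../some-local-project               # path on your machine
cget install http://example.com/some-lib.tar.gz  # URL to a tar.gz file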

cget can also accept as a package source a text file containing a list of package sources. By convention this file is called requirements.txt.

This means it's now possible to make a typical C++ CMake project and distribute it with a file called requirements.txt in the root. Users can then install all the packages they need with cget before building or installing the source code of your package.

Since cget is based on existing CMake and Make idioms and standards, this also means that if they're weirdos who don't want to use cget, they can still get information on what packages our project needs.

If we created a requirements.txt file for our project above, we could fill it with:

g-truc/glm@0.9.8.5

With a requirements.txt file in the root of our project our new process becomes:

cd your-project-directory
cget init
cget install -f requirements.txt
mkdir build && cd build
# -DCMAKE_TOOLCHAIN_FILE tells it to use cget's cenv
cmake -DCMAKE_TOOLCHAIN_FILE=../cget/cget/cget.cmake -H../ -B./
cmake --build ./

If someone else wants to install our project using cget, cget will find the requirements.txt file and install the dependencies we require first.

If you want to build your project with different compilers or otherwise use multiple configurations, you'll need more than one cenv. It's possible to make cget create cenvs in different locations by passing --prefix to cget init. The environment variable CGET_PREFIX can also tell cget to use a cenv other than the ./cget directory in the current directory.

Additionally, we can pass arbitrary toolchains as well as certain CMake settings to cget init, causing it to include them in the cenv toolchain file it creates.

This means building for multiple configurations looks like this:

# Build with GCC 6 in debug mode
cget init --prefix gcc-debug -DCMAKE_C_COMPILER:none=gcc-6 -DCMAKE_CXX_COMPILER:none=g++-6 -DCMAKE_BUILD_TYPE:none=Debug
export CGET_PREFIX=$(pwd)/gcc-debug
mkdir build-gcc && cd build-gcc
cmake -DCMAKE_TOOLCHAIN_FILE=../gcc-debug/cget/cget.cmake -H../ -B./
cmake --build ./
# Now build with Clang in release mode
cd ..
cget init --prefix clang-release -DCMAKE_C_COMPILER:none=clang-3.8 -DCMAKE_CXX_COMPILER:none=clang++-3.8 -DCMAKE_BUILD_TYPE:none=Release
export CGET_PREFIX=$(pwd)/clang-release
mkdir build-clang && cd build-clang
cmake -DCMAKE_TOOLCHAIN_FILE=../clang-release/cget/cget.cmake -H../ -B./
cmake --build ./

Creating a cenv in the root of each project you're working on is probably fine for some people, but I discovered I quickly grew sheepish about creating brand new cenvs for all my little projects; I seem to always be on the verge of filling the solid state drives where I do all my work. Additionally certain packages- such as Boost- can take up a ton of space.

This is very similar to a problem faced by Python developers who use virtualenvs, which are like cenvs but for Python projects. For the purposes of testing and CI, a virtualenv for every project makes sense, but depending on how prolific you are this can get expensive. Tools like pyenv and virtualenvwrapper help by keeping a list of virtualenvs that are available globally from a shell session and can be easily switched between.

I liked this workflow, so I did the same thing for cenvs by building a tool called, confusingly enough, Cenv (installing it is made to be simple even for those unfamiliar with Python, and it also installs cget).

Cenv manages a group of cenvs stored at ~/.cenv (C:\Users\your-name\.cenv on Windows). You create and list them like this:

$ cenv init gcc-debug -DCMAKE_C_COMPILER:none=gcc-6 -DCMAKE_CXX_COMPILER:none=g++-6 -DCMAKE_BUILD_TYPE:none=Debug
$ cenv init clang-release -DCMAKE_C_COMPILER:none=clang-3.8 -DCMAKE_CXX_COMPILER:none=clang++-3.8 -DCMAKE_BUILD_TYPE:none=Release
$ cenv list
  gcc-debug
  clang-release

You can activate one of these cenvs by calling cenv set:

$ cenv set gcc-debug
* * using gcc-debug
$ cenv list
* gcc-debug
  clang-release

Activating a cenv does three things:

  • It sets the CGET_PREFIX environment variable, so cget uses that cenv.
  • It adds the lib directory of the cenv to the PATH and LD_LIBRARY_PATH environment variables. This is necessary to run executables that have been linked to shared libraries or DLLs that were installed to the cenv (the alternative would be installing them globally or copying all of the needed shared libraries and DLLs to the same directory as the executable you're building, which is wasteful).
  • It wraps the cmake command so that it always passes in -DCMAKE_TOOLCHAIN_FILE=${CGET_PREFIX}/cget/cget.cmake, meaning you get to stop thinking about that.

Running cenv set with a different cenv, or cenv deactivate, undoes these changes (it also smartly removes the entries added to PATH and LD_LIBRARY_PATH).

With Cenv installed, building a project for two different configurations looks like this:

# Build with GCC 6 in debug mode
cenv set gcc-debug
mkdir build-gcc && cd build-gcc
cmake -H../ -B./
cmake --build ./
cd ..
cenv set clang-release
mkdir build-clang && cd build-clang
cmake -H../ -B./
cmake --build ./

It took me a while to appreciate cget and the standard CMake practices it advocates for, mostly because CMake itself, while being a useful, high quality tool, is loaded with so many options and settings that using it the right way isn't immediately clear, and often involves overly verbose arguments that made it feel like I was on the wrong path even when I wasn't.

However, at its core, package installation with CMake is simple, and I'd argue its inner workings are easier to understand than those of the current competing packaging tools for C++. Though cget is extremely useful, its source code is tiny, as it focuses on solving a few tiny problems very well. It makes existing CMake practices easier to use instead of inventing its own standards and procedures for installing C++ packages and furthering the babel of sorts the community is headed towards. As someone who has looked at most of the other C++ package managers, I think cget's approach is ultimately the simplest and most maintainable.

I believe cget and Cenv collectively rub most of the rough edges off of CMake, leaving a workflow that is scalable and pleasant. You can install both today by following the instructions on Cenv's README.

In a future blog post, I hope to discuss the basics of writing CMake files which correctly install and consume packages from cenvs.



Building Boost 1.65.1 and OpenSSL on Windows
Sunday October 29, 2017

Watching Vinnie Falco's excellent CppCon 2017 talk “Make Classes Great Again! (Using Concepts for Customization Points)” inspired me to try using Boost Beast, a library for working with HTTP and WebSockets in C++. Unfortunately, since this is C++, I spent most of my time building all of the libraries I needed on Windows, which for some reason I still use. This post is a summary of my findings for my own future reference.

Building Boost

First off, I hope you're sitting down because I must reveal to you that Boost is still a total pain in the ass to build on Windows with the latest Visual Studio 2017 release.

In the past, I considered Boost Build my friend, but now we are enemies due to reasons I keep wanting to write a blog post about; in short I realized it was a time vampire that was draining me of my precious mortality. I liked a lot about Boost Build compared to the alternatives, but unfortunately it suffers from a lack of quality which caused me to spend more time maintaining it than working on my projects. In comparison CMake is rock solid once you get it working.

Everyone calls Microsoft's C++ compiler “Visual Studio”, but the compiler itself is named “Microsoft Visual C++”. Boost Build bucks these social norms by referring to it as “msvc”, which I always admired. With Visual Studio 2015, the corresponding compiler version was 14.0, but with Visual Studio 2017 it changed to 14.1, which is surprising (of course, since Boost Build is built on bjam, a stringly-typed programming language worse than TCL, you can pass in “msvc-15.0” as the toolset and have no idea you've done something wrong until the build totally finishes, you get linker errors, and you finally realize the libraries were built with a different compiler).

Another thing that surprised me is that Boost Build doesn't do a good job of automatically finding the Microsoft tools it needs to call, meaning you have to run the “vcvars” batch file included with Visual Studio or open up the shortcut labeled “Visual Studio 2017's Developer Command Prompt”. In the past this wasn't necessary, so I never did it. I also liked keeping my shell clean from all the strange paths MS added to it. However, since both Boost and OpenSSL required me to do this, I've since added CALL “C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\VC\Auxiliary\Build\vcvars64.bat” to my bash.rc-like batch file that executes for my go-to Command Prompt session. Time will tell if this creates other problems. Side note: I don't bother using mingw or any other compiler on Windows anymore, since Linux lets me do that without the stress ulcer and runs on my laptop natively and on my desktop using the Windows Subsystem for Linux (aka Bash on Ubuntu on Windows). Part of the fallout of my divorce from Boost Build also means using multiple compilers is even harder than it was before on Windows, and there's no point when easier operating systems exist and most of my code is portable.

Another big change to my go-to shell was adding “C:\Program Files\7-Zip” to the PATH, which gives me access to the excellent 7-Zip tool from the command line (AppVeyor makes this available on its command line, which is a sign it's a good idea).

Finally, it's worth noting I have long put all of the typical Unix tools (grep, sed, ls, etc.) on my command prompt by installing Cygwin (which I used to need for Dreamcast development) and adding its bin directory to my path. In the script below I use curl and also sha256sum pretty often.

Next, I realized I had an ancient site-config.jam file with all my toolchains set up (one of Boost Build's best features). Unfortunately I also had a statement in there that said “using msvc : 14.0 ;”. In the end I commented all of this out.

Manuel Gustavo has written a series of scripts for building Boost versions 1.63 and 1.64 in Windows. I didn't use them other than to crib the commands he used to build the 64-bit variants of the libraries.

First, make sure you have this stuff in your command prompt:

set PATH=%PATH%;C:\Program Files\7-Zip
set PATH=%PATH%;C:\Cygwin\bin
CALL "C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\VC\Auxiliary\Build\vcvars64.bat"

Then run this:

cd C:\Tools\boost
curl -o boost_1_65_1.7z -L https://dl.bintray.com/boostorg/release/1.65.1/source/boost_1_65_1.7z
REM should be "df7bea4ce58076bd8e9cd63966562d8cf969b431e1c587969e6aa408e51de606"
sha256sum boost_1_65_1.7z
7z x boost_1_65_1.7z
cd boost_1_65_1
bootstrap.bat
b2.exe --toolset=msvc-14.1 --clean-all
b2.exe --toolset=msvc-14.1 architecture=x86 address-model=64 --stagedir=".\stage_x64" threading=multi --build-type=complete stage -j8
b2.exe --toolset=msvc-14.1 --clean-all
b2.exe --toolset=msvc-14.1 architecture=x86 address-model=32 --stagedir=".\stage_x86" threading=multi --build-type=complete stage -j8

Note: change ‘-j8’ above to ‘-jn’, where ‘n’ is the number of cores on your machine.

Manuel Gustavo has more code that places the libraries into a “bin” directory, but I've skipped that.

Once this runs, you need to set the environment variable BOOST_ROOT to the location of the unzipped boost_1_65_1 stuff from above.
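In my case that means adding this to the same startup batch file:

set BOOST_ROOT=C:\Tools\boost\boost_1_65_1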

Building OpenSSL

If you want to make websites or services, you're going to want to use SSL. It's a huge headache, but given how much the world relies on it you can't really avoid being initiated.

OpenSSL turns out to not be too difficult to build if you follow the exact necessary steps, much like every other problem faced by man. My main issues were caused by the myriad of outdated documentation that's out there.

First off, building OpenSSL on Windows requires Perl. I used Strawberry Perl, because I'm a fan of any programming tool or language named after food. It may feel annoying to have to install a language just to build a thing, but cheer up by imagining all the totally siiiiick Perl 5 code you'll be able to write after this.

Also, there are some differences between OpenSSL distributions. 1.1.0f is the newest version, but it turns out the 1.1.0 series doesn't work with Boost; see bugs like this for more info. So if you value your time on Earth, stick with 1.0.2.

Also, building with the ASM stuff is probably better but adds the need to download NASM. So I just skipped it with the “no-asm” option.

mkdir C:\Tools\openssl-64
mkdir C:\Tools\openssl-src
cd C:\Tools\openssl-src
curl -o openssl-1.0.2l.tar.gz -L https://www.openssl.org/source/openssl-1.0.2l.tar.gz
7z x openssl-1.0.2l.tar.gz
7z x openssl-1.0.2l.tar
cd openssl-1.0.2l
perl Configure VC-WIN64A no-asm --prefix=C:\Tools\openssl-64 --openssldir=C:\Tools\openssl-64
ms\do_win64a
nmake -f ms\nt.mak
nmake -f ms\nt.mak install

Building some Code

To start using Boost and OpenSSL from CMake you'll need to set some environment variables so CMake knows where you put that stuff.

set BOOST_LIBRARYDIR=C:\Tools\boost\boost_1_65_1\stage_x64\lib
set OPENSSL_ROOT_DIR=C:\Tools\openssl-64
SET PATH=%PATH%;%BOOST_LIBRARYDIR%

At this point you should be able to create simple CMake projects that can find and use OpenSSL and Boost (including stuff like coroutines).
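For reference, a minimal CMakeLists.txt for the client below might look like this (the component list is my guess at what the Beast/ASIO coroutine code pulls in, so adjust as needed):

cmake_minimum_required(VERSION 3.5)
project(beast_client)

find_package(Boost 1.65.1 REQUIRED COMPONENTS system coroutine context thread)
find_package(OpenSSL REQUIRED)

add_executable(beast_client main.cpp)
target_link_libraries(beast_client PRIVATE
    Boost::system Boost::coroutine Boost::context Boost::thread
    OpenSSL::SSL OpenSSL::Crypto)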

As painful as it is, the result is worth it. Here is what some asynchronous client code looks like using the Boost Beast library (it's header-only, so no nasty build step required).

boost::asio::spawn(ios, [&](boost::asio::yield_context yield)
    {
        tcp::resolver resolver{ios};
        ssl::stream<tcp::socket> stream{ios, ctx};

        {
            auto const begin = resolver.async_resolve(
                tcp::resolver::query{ host, port }, yield);
            tcp::resolver::iterator end;
            boost::asio::async_connect(
                stream.next_layer(), begin, end, yield);
        }

        stream.async_handshake(ssl::stream_base::client, yield);

        http::request<http::string_body> req{http::verb::get, target, version};
        req.set(http::field::host, host);
        req.set(http::field::user_agent, BOOST_BEAST_VERSION_STRING);

        http::async_write(stream, req, yield);

        boost::beast::flat_buffer b;

        http::response<http::dynamic_body> res;

        http::async_read(stream, b, res, yield);

        std::cout << res << std::endl;

        boost::system::error_code ec;
        // This is currently failing every single time due to this bug:
        //  https://svn.boost.org/trac10/ticket/12710
        stream.async_shutdown(yield[ec]);
        if(ec == boost::asio::error::eof) {
            ec.assign(0, ec.category());
        } else {
            boost::asio::detail::throw_error(ec);
        }
    });

(Note: I stole this from Vinnie Falco's example code, but I put the coroutine function into a lambda and rely on exceptions rather than error checking to make the code look nicer :)).

I think it's further proof that coroutines are the sexiest language feature ever: every place you see “yield” getting passed in would normally require a call to a separate function in classic Boost ASIO style, but here the coroutine yields, bringing the same advantages but without the fuss.







