<?xml version="1.0" encoding="UTF-8"?><rss version="2.0" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>libre — Janet&#39;s Shenanigans</title>
    <link>https://blog.blackquill.cc/tag:libre</link>
    <description></description>
    <pubDate>Thu, 30 Apr 2026 13:37:38 +0000</pubDate>
    <item>
      <title>New tools in the QML LSP collection: qml-dap, qml-dbg, and qml-lint</title>
      <link>https://blog.blackquill.cc/new-tools-in-the-qml-lsp-collection-qml-dap-qml-dbg-and-qml-lint</link>
      <description>&lt;![CDATA[The source for all the tools mentioned in this blog post is available here&#xA;&#xA;A screenshot of QML DAP providing debugger functionality for QML in VSCode&#xA;&#xA;qml-dap: QML debugger for editors&#xA;&#xA;While working on qml-lsp, I took a tangent to write a DAP implementation for QML. This ended up being a very long tangent, but it&#39;s worth it: being able to debug QML without needing Qt Creator available. The DAP protocol is the debugger equivalent to LSP: it&#39;s a cross-editor and cross-language protocol that allows debuggers to implement DAP and get support for a bunch of editors, and allows editors to implement DAP and get support for a bunch of debuggers.&#xA;&#xA;In short, this means that you can use qml-dap in combination with a DAP-supporting editor of your choice, and gain access to a debugger.&#xA;&#xA;Note that QML DAP doesn&#39;t support the entirety of DAP, but supports enough to serve as an improvement over print debugging. More of the protocol will be covered as I improve qml-dap.&#xA;&#xA;qml-dbg: QML debugger for the terminal&#xA;&#xA;While writing the code necessary for debugging QML programs for qml-dap, I realised that QML had no command line debugger. So, I wrote one, and called it qml-dbg.&#xA;&#xA;The output of a sample qml-dbg session is provided here for your browsing:&#xA;&#xA;❯ qml-dbg&#xA;Hi, welcome to qml-dbg! Type &#34;help&#34; if you want me to explain how you use me.&#xA;  attach localhost:5050&#xA;Connecting to localhost:5050...&#xA;I connected to the program! 
Now, you can start debugging it.&#xA;  b a.qml:13&#xA;I set breakpoint #0&#xA;The program will pause right before it starts to run that line of code&#xA;You can disable it with &#39;breakpoint-disable 0&#39;&#xA;See active breakpoints with &#39;breakpoints&#39;&#xA;  breakpoints&#xA;Breakpoints:&#xA;        #0 at a.qml:13&#xA;  breakpoint-disable 0&#xA;Disabled breakpoint 0&#xA;  breakpoints&#xA;Breakpoints:&#xA;        (disabled) #0 at a.qml:13&#xA;  breakpoint-enable 0&#xA;Enabled breakpoint 0&#xA;The program paused!&#xA;Run &#39;backtrace&#39; to see a stack trace.&#xA;    bt&#xA;Most recently called function:&#xA;        onClicked in a.qml:13 (file:///home/jblackquill/Scratch/a.qml)&#xA;  eval btn1.text = &#34;hello debugger!&#34;&#xA;&#34;hello debugger!&#34;&#xA;  continue&#xA;  ## qml-lint: standalone linting tool&#xA;&#xA;If you really liked qml-lsp&#39;s lints, then you can now have them standalone on the command-line.&#xA;&#xA;❯ qml-lint a.qml &#xA;12:3 - 12:6     a.qml   Don&#39;t use var in modern JavaScript. Consider using &#34;const&#34; here instead. (var lint)&#xA;&#xA;        var a = 5&#xA;&#xA;15:2 - 15:14    a.qml   Don&#39;t use anchors.fill in a RowLayout. Instead, consider using &#34;Layout.fillWidth: true&#34; and &#34;Layout.fillHeight: true&#34; (anchors in layouts lint)&#xA;&#xA;        anchors.fill: parent&#xA;&#xA;Tags: #libre]]&gt;</description>
      <content:encoded><![CDATA[<p><a href="https://invent.kde.org/cblack/qew-em-el-el-ess-pee/-/starrers" rel="nofollow">The source for all the tools mentioned in this blog post is available here</a></p>

<p><img src="https://img.blackquill.cc/qml-dap-going.png" alt="A screenshot of QML DAP providing debugger functionality for QML in VSCode"></p>

<h2 id="qml-dap-qml-debugger-for-editors">qml-dap: QML debugger for editors</h2>

<p>While working on qml-lsp, I took a tangent to write a DAP implementation for QML. It ended up being a very long tangent, but it was worth it: you can now debug QML without needing Qt Creator installed. DAP is the debugger equivalent of LSP: a cross-editor, cross-language protocol. A debugger that implements DAP gains support in a bunch of editors, and an editor that implements DAP gains support for a bunch of debuggers.</p>

<p>In short, this means that you can use qml-dap in combination with a DAP-supporting editor of your choice, and gain access to a debugger.</p>

<p>Note that qml-dap doesn&#39;t support the entirety of DAP, but it supports enough to be an improvement over print debugging. More of the protocol will be supported as I improve qml-dap.</p>
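<p>To give a feel for what DAP traffic looks like on the wire, here is a minimal sketch based only on the public DAP specification (not qml-dap&#39;s internals): every message is a JSON body preceded by a <code>Content-Length</code> header, just like LSP.</p>

```go
package main

import (
	"encoding/json"
	"fmt"
)

// dapRequest mirrors the basic "request" message shape from the
// public Debug Adapter Protocol specification.
type dapRequest struct {
	Seq     int    `json:"seq"`
	Type    string `json:"type"`
	Command string `json:"command"`
}

// frame serialises a request the way DAP (and LSP) transports expect:
// a Content-Length header, a blank line, then the JSON body.
func frame(r dapRequest) (string, error) {
	body, err := json.Marshal(r)
	if err != nil {
		return "", err
	}
	return fmt.Sprintf("Content-Length: %d\r\n\r\n%s", len(body), body), nil
}

func main() {
	msg, _ := frame(dapRequest{Seq: 1, Type: "request", Command: "initialize"})
	fmt.Print(msg)
}
```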

<h2 id="qml-dbg-qml-debugger-for-the-terminal">qml-dbg: QML debugger for the terminal</h2>

<p>While writing the code necessary for debugging QML programs for qml-dap, I realised that QML had no command-line debugger. So I wrote one, and called it qml-dbg.</p>

<p>The output of a sample qml-dbg session is provided here for your browsing:</p>

<pre><code>❯ qml-dbg
Hi, welcome to qml-dbg! Type &#34;help&#34; if you want me to explain how you use me.
&gt; attach localhost:5050
Connecting to localhost:5050...
I connected to the program! Now, you can start debugging it.
&gt; b a.qml:13
I set breakpoint #0
The program will pause right before it starts to run that line of code
You can disable it with &#39;breakpoint-disable 0&#39;
See active breakpoints with &#39;breakpoints&#39;
&gt; breakpoints
Breakpoints:
        #0 at a.qml:13
&gt; breakpoint-disable 0
Disabled breakpoint 0
&gt; breakpoints
Breakpoints:
        (disabled) #0 at a.qml:13
&gt; breakpoint-enable 0
Enabled breakpoint 0
The program paused!
Run &#39;backtrace&#39; to see a stack trace.
&gt; 
&gt; bt
Most recently called function:
        onClicked in a.qml:13 (file:///home/jblackquill/Scratch/a.qml)
&gt; eval btn1.text = &#34;hello debugger!&#34;
&#34;hello debugger!&#34;
&gt; continue
&gt; 
</code></pre>

<h2 id="qml-lint-standalone-linting-tool">qml-lint: standalone linting tool</h2>

<p>If you liked qml-lsp&#39;s lints, you can now run them standalone on the command line.</p>

<pre><code>❯ qml-lint a.qml 
12:3 - 12:6     a.qml   Don&#39;t use var in modern JavaScript. Consider using &#34;const&#34; here instead. (var lint)

        var a = 5

15:2 - 15:14    a.qml   Don&#39;t use anchors.fill in a RowLayout. Instead, consider using &#34;Layout.fillWidth: true&#34; and &#34;Layout.fillHeight: true&#34; (anchors in layouts lint)

        anchors.fill: parent
</code></pre>

<p>Tags: <a href="https://blog.blackquill.cc/tag:libre" class="hashtag" rel="nofollow"><span>#</span><span class="p-category">libre</span></a></p>
]]></content:encoded>
      <guid>https://blog.blackquill.cc/new-tools-in-the-qml-lsp-collection-qml-dap-qml-dbg-and-qml-lint</guid>
      <pubDate>Sat, 05 Mar 2022 05:13:16 +0000</pubDate>
    </item>
    <item>
      <title>qml-doxygen: qml-lsp&#39;s qml → doxygen cousin</title>
      <link>https://blog.blackquill.cc/qml-doxygen-qml-lsps-qml-doxygen-cousin</link>
      <description>&lt;![CDATA[The Cool Now&#xA;&#xA;With the infrastructure I built in qml-lsp for parsing and analysing QML files, I thought &#34;hm, since doxyqml is just a glorified qml parser -  c++ header file converter, wouldn&#39;t it be trivial to write the same thing in go reusing qml-lsp&#39;s infrastructure?&#34; And that&#39;s exactly what I did. I wrote a 130-line program that faithfully replicated doxyqml&#39;s functionality in Go.&#xA;&#xA;By virtue of being a Go program that calls on a pretty optimised parser in C, it ended up being a little over 10 times faster than doxyqml on my system.&#xA;&#xA;I wasn&#39;t done there.&#xA;&#xA;I thought &#34;hmm, couldn&#39;t I reuse the semantic analysis I did for qml-lsp to improve the output a bit?&#34;&#xA;&#xA;So, that&#39;s pretty much what I did.&#xA;&#xA;Currently, the most notable improvement over doxyqml is in producing a better superclass for the output:&#xA;&#xA;doxyqml, with aliased import (import foo as bar):&#xA;class Avatar : public QtQuick.Controls.Control {&#xA;&#xA;doxyqml, without aliased import:&#xA;class Avatar : Control {&#xA;&#xA;One isn&#39;t valid C++ (I&#39;m surprised Doxygen takes it at all), and the other fails to specifically name where Control comes from, leading to issues with Doxygen trying to locate the superclass.&#xA;&#xA;qml-doxygen reuses the semantic analysis from qml-lsp to generate the following output, whether the import is aliased or not:&#xA;class Avatar : public QtQuick::Controls::Control {&#xA;&#xA;It&#39;s both valid C++, and tells Doxygen exactly where the name is coming from.&#xA;&#xA;The Roadmap&#xA;&#xA;The next thing I&#39;m planning to do is to resolve the concrete type of an alias property, so that documentation generation for aliases can be improved without developers needing to explicitly tell the computer what type the alias points to.&#xA;&#xA;I may also add the ability to &#34;splat&#34; grouped properties with a special sigil, so that something like 
readonly property AvatarGroup actions: AvatarGroup { } can be expanded into the properties of the AvatarGroup by qml-doxygen, resulting in better documentation.&#xA;&#xA;Tags: #libre]]&gt;</description>
      <content:encoded><![CDATA[<h2 id="the-cool-now">The Cool Now</h2>

<p>With the infrastructure I built in qml-lsp for parsing and analysing QML files, I thought “hm, since doxyqml is just a glorified QML parser → C++ header file converter, wouldn&#39;t it be trivial to write the same thing in Go reusing qml-lsp&#39;s infrastructure?” And that&#39;s exactly what I did. I wrote a 130-line program that faithfully replicated doxyqml&#39;s functionality in Go.</p>

<p>By virtue of being a Go program that calls on a pretty optimised parser in C, it ended up being a little over 10 times faster than doxyqml on my system.</p>

<p>I wasn&#39;t done there.</p>

<p>I thought “hmm, couldn&#39;t I reuse the semantic analysis I did for qml-lsp to improve the output a bit?”</p>

<p>So, that&#39;s pretty much what I did.</p>

<p>Currently, the most notable improvement over doxyqml is in producing a better superclass for the output:</p>

<p><code>doxyqml</code>, with aliased import (<code>import foo as bar</code>):</p>

<pre><code class="language-c++">class Avatar : public QtQuick.Controls.Control {
</code></pre>

<p><code>doxyqml</code>, without aliased import:</p>

<pre><code class="language-c++">class Avatar : Control {
</code></pre>

<p>One isn&#39;t valid C++ (I&#39;m surprised Doxygen takes it at all), and the other fails to specifically name where <code>Control</code> comes from, leading to issues with Doxygen trying to locate the superclass.</p>

<p><code>qml-doxygen</code> reuses the semantic analysis from <code>qml-lsp</code> to generate the following output, whether the import is aliased or not:</p>

<pre><code class="language-c++">class Avatar : public QtQuick::Controls::Control {
</code></pre>

<p>It&#39;s both valid C++, and tells Doxygen exactly where the name is coming from.</p>
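<p>Once semantic analysis has resolved which module a type actually comes from, the transformation itself is mechanical. A toy sketch of the idea (the <code>qualify</code> helper below is hypothetical, not qml-doxygen&#39;s actual code):</p>

```go
package main

import (
	"fmt"
	"strings"
)

// qualify turns a resolved QML module plus a type name into the
// fully-qualified C++ name that Doxygen can locate.
// Hypothetical helper for illustration only.
func qualify(module, typeName string) string {
	return strings.ReplaceAll(module, ".", "::") + "::" + typeName
}

func main() {
	// Whether or not the import was aliased, the resolved module is
	// the same, so the generated superclass is stable.
	fmt.Println(qualify("QtQuick.Controls", "Control")) // → QtQuick::Controls::Control
}
```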

<h2 id="the-roadmap">The Roadmap</h2>

<p>The next thing I&#39;m planning to do is to resolve the concrete type of an <code>alias</code> property, so that documentation generation for aliases can be improved without developers needing to explicitly tell the computer what type the alias points to.</p>

<p>I may also add the ability to “splat” grouped properties with a special sigil, so that something like <code>readonly property AvatarGroup actions: AvatarGroup { }</code> can be expanded into the properties of the <code>AvatarGroup</code> by <code>qml-doxygen</code>, resulting in better documentation.</p>

<p>Tags: <a href="https://blog.blackquill.cc/tag:libre" class="hashtag" rel="nofollow"><span>#</span><span class="p-category">libre</span></a></p>
]]></content:encoded>
      <guid>https://blog.blackquill.cc/qml-doxygen-qml-lsps-qml-doxygen-cousin</guid>
      <pubDate>Sun, 09 Jan 2022 07:03:23 +0000</pubDate>
    </item>
    <item>
      <title>Janet&#39;s Mid-December Shenanigans</title>
      <link>https://blog.blackquill.cc/janets-shenanigans</link>
      <description>&lt;![CDATA[qml-lsp&#xA;&#xA;I worked on fancying up qml-lsp&#39;s underlying infrastructure and added some new features. One of the more noticeable things is that attached properties can complete now:&#xA;&#xA;screenshot of completing an attached property&#xA;&#xA;There&#39;s also some lints:&#xA;&#xA;screenshot of linting warning about an unused import&#xA;&#xA;Currently there&#39;s some lints warning against usage of language features that negatively impair readability, (with statements, alias, etc.), warning about using anchors in a Layout, and unused imports.&#xA;&#xA;Shades&#xA;&#xA;Shades is a small utility app I made that lets you grab shades of a colour.&#xA;&#xA;screenshot of shades&#xA;&#xA;It uses the OKLAB colour space to generate shades, which means it produces really good variations on a colour that only differ in visual brightness instead of going &#34;off-brand&#34; like other colour spaces would have.&#xA;&#xA;Tok&#xA;&#xA;Yeah, Tok just has the &#34;saved messages&#34; room display as such instead of using your account&#39;s name and avatar. Not much, which is why it&#39;s getting lumped in here instead of in its own blog post.&#xA;&#xA;Tags: #libre]]&gt;</description>
      <content:encoded><![CDATA[<h2 id="qml-lsp-https-invent-kde-org-cblack-qew-em-el-el-ess-pee"><a href="https://invent.kde.org/cblack/qew-em-el-el-ess-pee" rel="nofollow">qml-lsp</a></h2>

<p>I worked on fancying up qml-lsp&#39;s underlying infrastructure and added some new features. One of the more noticeable things is that attached properties can complete now:</p>

<p><img src="https://img.blackquill.cc/mald%20layout.png" alt="screenshot of completing an attached property"></p>

<p>There&#39;s also some lints:</p>

<p><img src="https://img.blackquill.cc/unused%20import.png" alt="screenshot of linting warning about an unused import"></p>

<p>Currently there are lints warning against language features that hurt readability (<code>with</code> statements, <code>alias</code>, etc.), warning about using anchors in a Layout, and flagging unused imports.</p>
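<p>To give a feel for what an unused-import lint has to decide, here is a deliberately naive toy (nothing like qml-lsp&#39;s real semantic analysis): treat an import as unused if its module&#39;s last name component never appears anywhere else in the file.</p>

```go
package main

import (
	"fmt"
	"strings"
)

// unusedImports naively reports "import X ..." lines whose module's
// last component never appears elsewhere in the source. Real linting
// needs proper parsing and scope analysis; this is only a sketch.
func unusedImports(src string) []string {
	lines := strings.Split(src, "\n")
	var unused []string
	for i, line := range lines {
		fields := strings.Fields(line)
		if len(fields) < 2 || fields[0] != "import" {
			continue
		}
		parts := strings.Split(fields[1], ".")
		name := parts[len(parts)-1]
		// Everything except the import line itself.
		rest := strings.Join(append(append([]string{}, lines[:i]...), lines[i+1:]...), "\n")
		if !strings.Contains(rest, name) {
			unused = append(unused, fields[1])
		}
	}
	return unused
}

func main() {
	src := "import QtQuick 2.15\nimport QtQuick.Layouts 1.15\n\nItem { width: 100 }"
	fmt.Println(unusedImports(src)) // → [QtQuick.Layouts]
}
```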

<h2 id="shades">Shades</h2>

<p>Shades is a small utility app I made that lets you grab shades of a colour.</p>

<p><img src="https://img.blackquill.cc/shades.png" alt="screenshot of shades"></p>

<p>It uses the OKLAB colour space to generate shades, which means it produces really good variations on a colour that differ only in visual brightness, instead of drifting “off-brand” the way other colour spaces would.</p>

<h2 id="tok">Tok</h2>

<p>Yeah, Tok just has the “saved messages” room display as such instead of using your account&#39;s name and avatar. Not much, which is why it&#39;s getting lumped in here instead of in its own blog post.</p>

<p>Tags: <a href="https://blog.blackquill.cc/tag:libre" class="hashtag" rel="nofollow"><span>#</span><span class="p-category">libre</span></a></p>
]]></content:encoded>
      <guid>https://blog.blackquill.cc/janets-shenanigans</guid>
      <pubDate>Sat, 18 Dec 2021 19:33:16 +0000</pubDate>
    </item>
    <item>
      <title>Porting Tok To Not Linux: The Journey Of Incessant Failure</title>
      <link>https://blog.blackquill.cc/porting-tok-to-not-linux-the-journey-of-incessant-failure</link>
      <description>&lt;![CDATA[This response from a reader pretty much sums it up:&#xA;c++ package manager failure&#xA;&#xA;Stop One: Will Craft Work?&#xA;&#xA;Craft seems like the no-brainer for a KDE project to use, considering it&#39;s in-house and supports all of KDE&#39;s frameworks and has packaging capabilities. Unfortunately, what sounds good on paper does not translate to what&#39;s good in execution.&#xA;&#xA;When I first checked it out, the current version being shipped had incorrect code that would have failed to compile had it been a compiled language instead of Python. Heck, even running a typechecker for Python would have revealed the fact that the code was trying to call a function that didn&#39;t exist. Yet, this managed to get shipped. Not a good first impression.&#xA;&#xA;After manually patching in the function into existence on my system, I ran into another hurdle: Craft&#39;s env script is broken; mangling PATH to an extent where entries like /usr/bin and /bin and other things got just truncated into oblivion, resulting in a shell where you couldn&#39;t do much of anything.&#xA;&#xA;After manually patching PATH to be not mangled, I ran into another and the final hurdle before I gave up: Craft tried to use half bundled libraries and tools and half system libraries and tools, resulting in dynamic linker errors from system tools not finding symbols they needed from bundled libraries.&#xA;&#xA;When I brought these issues up in the Craft chat, the answers basically amounted to a lack of care and &#34;go use Ubuntu.&#34; Not acceptable for Tok considering most of the people interested in building Tok like this don&#39;t use Ubuntu, and honestly doesn&#39;t make you have much faith in a system for porting utilities to other platforms if said system doesn&#39;t even work across the distributions of one platform.&#xA;&#xA;Stop Two: Maybe Conan?&#xA;&#xA;Conan seems like the second-in-line no-brainer for Tok to use. 
It&#39;s the largest C++ package manager, even supporting Qbs. Of course, like with Craft, what sounds good on paper doesn&#39;t match up to execution.&#xA;&#xA;Out of the gate, I looked at the Qt package, only to find that there was one (1) Qt package for it consisting of the entirety of Qt, WebEngne and all. Kinda oof, but not a deal breaker. Well, it wouldn&#39;t be a dealbreaker if Conan had prebuilt Qt packages for&#xA;Settings: arch=x8664, buildtype=Release, compiler=gcc, compiler.version=11, os=Linux&#xA;&#xA;But it doesn&#39;t. I&#39;m not going to build an entire web engine just for an attempt at maybe getting a non-Linux build of Tok, and having to build a web engine as part of Tok&#39;s CI is a no-go in terms of disk, memory and CPU.&#xA;&#xA;Stop Three: vcpkg&#xA;&#xA;Considering Microsoft has been a developer tools company for about as thrice as long as I&#39;ve been alive, I hope their take at the C++ package manager thing is worth their salt.&#xA;&#xA;Some weirdness ensued with the VCPKG_ROOT environment variable at first, but it was easy to fix by pointing it at the vcpkg repo.&#xA;&#xA;While doing the vcpkg install, I found the output somewhat hard to follow, so I had no idea how far along it was. I just let it sit since it seemed to be making progress.&#xA;&#xA;While vcpkg didn&#39;t have prebuilt binaries for my setup, it didn&#39;t require building all Qt modules like Conan did, so the ask was much more reasonable.&#xA;&#xA;And then I noticed a big issue: vcpkg has absolutely zero versioning, other than the git repository with all the package manifests. This essentially means that in order to build with Qt5, I need to commit to an ancient version of vcpkg packages and stay there indefinitely. I also have to ask users to roll back their vcpkg install that far to build Tok. 
Not really acceptable as an ask for people who might want to build Tok for not Linux.&#xA;&#xA;Stop Four: Wait, It&#39;s Already Here (At Least For Macs)&#xA;&#xA;Turns out Nix, the thing that Tok already supports building with, also supports macOS. Well, that was easy. While it doesn&#39;t spit out a premade .app like other porting systems can do, it does ensure a working build with available dependencies, which is already most of the way there.&#xA;&#xA;Conclusion: Apples And Penguins Rule, Everyone Else Drools&#xA;&#xA;Cross-platform C++ packaging and distribution is still a very unsolved problem, unlike other languages/frameworks like Go, Electron, Rust, Zig, etc. as I learned painfully through these escapades. Nix seems the most promising on this front, as it offers a very consistent environment across platforms, which gets you most of the way there in terms of building. It doesn&#39;t support Windows (yet?), but simply being able to use a functional compiler instead of Apple Clang is already a killer feature for using it to port apps to macOS.&#xA;&#xA;Qbs is also a huge help in terms of the porting process, as it natively supports building things that require auxiliary scripts or build system hacks with other build systems, like .app bundles, Windows installers, and multi-architecture .app/.aab/.apks with just a few or no extra lines in the build system.&#xA;&#xA;For Tok on macOS, all I need to do is add these two lines to the build script in order to get a usable .app file from it:&#xA;&#xA;Depends { name: &#34;bundle&#34; }&#xA;bundle.isBundle: true&#xA;&#xA;While it lacks a lot of metadata that you need to fill in yourself, it&#39;s again, another 99% of the way there solution where the remaining 1% is mostly just a little data or boilerplate or running a tool.&#xA;&#xA;I still haven&#39;t figured out what I&#39;ll be doing for Windows, but the need for an end-user Windows package is still a long ways off, considering Tok is still nowhere near a 1.0 
release status. Perhaps I can make leverage of Fedora&#39;s mingw packages or check out https://mxe.cc/, or maybe just install dependencies from source to a Windows system without a package manager, and bundle them during the build process. If you have any suggestions, do feel free to hop on by in the Tok chat and drop them.&#xA;&#xA;Tags: #libre]]&gt;</description>
      <content:encoded><![CDATA[<p>This response from a reader pretty much sums it up:
<img src="https://img.blackquill.cc/Screenshot%202021-11-21%20at%2014-40-47%20Discord%20-%20A%20New%20Way%20to%20Chat%20with%20Friends%20Communities.png"></p>

<h2 id="stop-one-will-craft-work">Stop One: Will Craft Work?</h2>

<p>Craft seems like the no-brainer for a KDE project to use, considering it&#39;s in-house and supports all of KDE&#39;s frameworks and has packaging capabilities. Unfortunately, what sounds good on paper does not translate to what&#39;s good in execution.</p>

<p>When I first checked it out, the current version being shipped had incorrect code that would have failed to compile had it been a compiled language instead of Python. Heck, even running a typechecker for Python would have revealed the fact that the code was trying to call a function that didn&#39;t exist. Yet, this managed to get shipped. Not a good first impression.</p>

<p>After manually patching the function into existence on my system, I ran into another hurdle: Craft&#39;s env script is broken, mangling PATH to the point where entries like /usr/bin and /bin simply got truncated into oblivion, resulting in a shell where you couldn&#39;t do much of anything.</p>

<p>After manually patching PATH back into shape, I ran into the final hurdle before I gave up: Craft tried to use half bundled libraries and tools and half system libraries and tools, resulting in dynamic linker errors from system tools that couldn&#39;t find the symbols they needed in the bundled libraries.</p>

<p>When I brought these issues up in the Craft chat, the answers basically amounted to a lack of care and “go use Ubuntu.” That&#39;s not acceptable for Tok, considering most of the people interested in building Tok like this don&#39;t use Ubuntu; and honestly, it doesn&#39;t inspire much faith in a system for porting to other platforms when that system doesn&#39;t even work across the distributions of one platform.</p>

<h2 id="stop-two-maybe-conan">Stop Two: Maybe Conan?</h2>

<p>Conan seems like the second-in-line no-brainer for Tok to use. It&#39;s the largest C++ package manager, even supporting Qbs. Of course, like with Craft, what sounds good on paper doesn&#39;t match up to execution.</p>

<p>Out of the gate, I looked at the Qt package, only to find that there was one (1) Qt package, consisting of the entirety of Qt, WebEngine and all. Kinda oof, but not a dealbreaker. Well, it wouldn&#39;t be a dealbreaker if Conan had prebuilt Qt packages for</p>

<pre><code>- Settings: arch=x86_64, build_type=Release, compiler=gcc, compiler.version=11, os=Linux
</code></pre>

<p>But it doesn&#39;t. I&#39;m not going to build an entire web engine just for an attempt at <em>maybe</em> getting a non-Linux build of Tok, and having to build a web engine as part of Tok&#39;s CI is a no-go in terms of disk, memory and CPU.</p>

<h2 id="stop-three-vcpkg">Stop Three: vcpkg</h2>

<p>Considering Microsoft has been a developer tools company for about thrice as long as I&#39;ve been alive, I hoped their take on the C++ package manager thing would be worth its salt.</p>

<p>Some weirdness ensued with the VCPKG_ROOT environment variable at first, but it was easy to fix by pointing it at the vcpkg repo.</p>

<p>While doing the vcpkg install, I found the output somewhat hard to follow, so I had no idea how far along it was. I just let it sit since it seemed to be making progress.</p>

<p>While vcpkg didn&#39;t have prebuilt binaries for my setup, it didn&#39;t require building all Qt modules like Conan did, so the ask was much more reasonable.</p>

<p>And then I noticed a big issue: vcpkg has absolutely zero versioning, other than the git repository with all the package manifests. This essentially means that in order to build with Qt5, I need to commit to an ancient version of vcpkg packages and stay there indefinitely. I also have to ask users to roll back their vcpkg install that far to build Tok. Not really acceptable as an ask for people who might want to build Tok for not Linux.</p>

<h2 id="stop-four-wait-it-s-already-here-at-least-for-macs">Stop Four: Wait, It&#39;s Already Here (At Least For Macs)</h2>

<p>Turns out Nix, the thing that Tok already supports building with, also supports macOS. Well, that was easy. While it doesn&#39;t spit out a premade <code>.app</code> like other porting systems can, it does ensure a working build with available dependencies, which is already most of the way there.</p>

<h2 id="conclusion-apples-and-penguins-rule-everyone-else-drools">Conclusion: Apples And Penguins Rule, Everyone Else Drools</h2>

<p>Cross-platform C++ packaging and distribution is still a very unsolved problem, unlike in other languages/frameworks like Go, Electron, Rust, and Zig, as I learned painfully through these escapades. Nix seems the most promising on this front, as it offers a very consistent environment across platforms, which gets you most of the way there in terms of building. It doesn&#39;t support Windows (yet?), but simply being able to use a functional compiler instead of Apple Clang is already a killer feature for using it to port apps to macOS.</p>

<p>Qbs is also a huge help in terms of the porting process, as it natively supports building things that require auxiliary scripts or build system hacks with other build systems, like <code>.app</code> bundles, Windows installers, and multi-architecture <code>.app</code>/<code>.aab</code>/<code>.apk</code>s with just a few or no extra lines in the build system.</p>

<p>For Tok on macOS, all I need to do is add these two lines to the build script in order to get a usable <code>.app</code> file from it:</p>

<pre><code class="language-qml">Depends { name: &#34;bundle&#34; }
bundle.isBundle: true
</code></pre>

<p>While it lacks a lot of metadata that you need to fill in yourself, it&#39;s, again, a 99%-of-the-way-there solution where the remaining 1% is mostly just a little data, boilerplate, or running a tool.</p>

<p>I still haven&#39;t figured out what I&#39;ll be doing for Windows, but the need for an end-user Windows package is still a long way off, considering Tok is nowhere near a 1.0 release. Perhaps I can leverage Fedora&#39;s mingw packages or check out <a href="https://mxe.cc/" rel="nofollow">https://mxe.cc/</a>, or maybe just install dependencies from source on a Windows system without a package manager and bundle them during the build process. If you have any suggestions, do feel free to hop on by in the <a href="https://t.me/kdetok" rel="nofollow">Tok chat</a> and drop them.</p>

<p>Tags: <a href="https://blog.blackquill.cc/tag:libre" class="hashtag" rel="nofollow"><span>#</span><span class="p-category">libre</span></a></p>
]]></content:encoded>
      <guid>https://blog.blackquill.cc/porting-tok-to-not-linux-the-journey-of-incessant-failure</guid>
      <pubDate>Sun, 21 Nov 2021 19:32:01 +0000</pubDate>
    </item>
    <item>
      <title>Introducing hRPC: a simple RPC system for user-facing APIs</title>
      <link>https://blog.blackquill.cc/introducing-hrpc-a-simple-rpc-system-for-user-facing-apis</link>
      <description>&lt;![CDATA[This is the sequel post to the previous post on hRPC, written now that hRPC has matured and gotten an actual spec. hRPC is a new RPC system that we, at Harmony, have been developing and using for our decentralized chat protocol. It uses Protocol Buffers (Protobufs) as a wire format, and supports streaming. This post contains much more information about hRPC, as well as tutorials for building a chat server in Rust and a web client in Vue 3. This post is mirrored from our post on dev.to. Co-authored by: Yusdacra (yusdacra@GitHub), Bluskript (Bluskript@GitHub), Pontaoski (pontaoski@GitHub). Tags: #libre #harmony]]&gt;</description>
      <content:encoded><![CDATA[<blockquote><p><em>This is the sequel post to <a href="https://blog.blackquill.cc/hrpc-and-why-we-moved-away-from-grpc" rel="nofollow">the previous post on hRPC</a>, written now that hRPC has matured and gotten an actual spec. This post contains much more information about hRPC, as well as tutorials for it.</em></p>

<p>This post is mirrored from <a href="https://dev.to/harmonydevelopment/introducing-hrpc-a-simple-rpc-system-for-user-facing-apis-16ge" rel="nofollow">our post on dev.to</a></p></blockquote>

<p><em>Co-authored by: Yusdacra (<a href="https://github.com/yusdacra" rel="nofollow">yusdacra@GitHub</a>), Bluskript (<a href="https://github.com/Bluskript" rel="nofollow">Bluskript@GitHub</a>), Pontaoski (<a href="https://github.com/pontaoski" rel="nofollow">pontaoski@GitHub</a>)</em></p>

<p><strong>hRPC is a new RPC system that we, at <a href="https://github.com/harmony-development" rel="nofollow">Harmony</a>, have been developing and using for our decentralized chat protocol. It uses Protocol Buffers (Protobufs) as a wire format, and supports streaming</strong>.</p>

<p>hRPC is primarily made for user-facing APIs and aims to be as simple to use as possible.</p>

<p>If you would like to learn more, the hRPC specification can be found <a href="https://github.com/harmony-development/hrpc/blob/main/protocol/SPEC.md" rel="nofollow">here</a>.</p>

<blockquote><p><strong>What is an RPC system?</strong>
If you know traditional API models like REST, then you can think of RPC as a more integrated version of that. Instead of defining requests by endpoint and method, requests are defined as methods on objects or services. With good code generation, an RPC system is often easier and safer to use for both clients and servers.</p></blockquote>
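<p>As a quick illustration of the idea (a plain Rust sketch, not the actual hRPC-generated API; the trait and type names here are made up), an RPC surface is just a service trait whose methods correspond to endpoints:</p>

<pre><code class="language-rust">// A sketch of the RPC model: instead of &#34;POST /chat/send&#34;,
// requests are methods on a service trait.
trait Chat {
    fn send_message(&amp;self, content: &amp;str) -&gt; Result&lt;(), String&gt;;
}

struct InMemoryChat;

impl Chat for InMemoryChat {
    fn send_message(&amp;self, content: &amp;str) -&gt; Result&lt;(), String&gt; {
        if content.is_empty() {
            return Err(&#34;empty message&#34;.into());
        }
        println!(&#34;sent: {content}&#34;);
        Ok(())
    }
}

fn main() {
    // The call site reads like a plain method call; the transport
    // (HTTP, WebSockets, ...) hides behind the trait.
    InMemoryChat.send_message(&#34;hello&#34;).unwrap();
}
</code></pre>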

<h2 id="why-hrpc">Why hRPC?</h2>

<p>hRPC uses REST to model plain unary requests, and WebSockets to model streaming requests. As such, it should be easy to write a library for languages that don&#39;t already have one.</p>

<p>hRPC features:</p>
<ul><li>Type safety</li>
<li>Strict protocol conformance on both ends</li>
<li>Easy streaming logic</li>
<li>More elegant server and client code with interfaces/traits and endpoint generation</li>
<li>Cross-language code generation</li>
<li>Smaller request sizes</li>
<li>Faster request parsing</li></ul>

<h2 id="why-not-twirp">Why Not Twirp?</h2>

<p>Twirp and hRPC have a lot in common, but the key difference that makes Twirp a dealbreaker for Harmony is its lack of support for <strong>streaming RPCs</strong>. Harmony&#39;s vision was to represent all endpoints in Protobuf format, which made Twirp fundamentally incompatible.</p>

<h2 id="why-not-grpc">Why Not gRPC?</h2>

<p>gRPC is the de facto RPC system; in fact, Protobuf and gRPC are often used together. So the question is, why would you want to use hRPC instead?</p>

<p>Unfortunately, gRPC has many limitations, and most of them result from its low-level nature.</p>

<h3 id="the-lack-of-web-support">The lack of web support</h3>

<p>At Harmony, support for web-based clients was a must, as was keeping things simple to implement. gRPC had neither. As stated by gRPC:</p>

<blockquote><p>It is currently impossible to implement the HTTP/2 gRPC spec in the browser, as there is simply no browser API with enough fine-grained control over the requests.</p></blockquote>

<h3 id="the-grpc-slowloris">The gRPC slowloris</h3>

<p>gRPC streams are essentially just a long-running HTTP request. Whenever data needs to be sent, a new HTTP/2 frame is sent on that request. The issue with this, however, is that <strong>most reverse proxies do not understand gRPC streaming.</strong> At Harmony, it was fairly common for sockets to disconnect because they were idle for long stretches of time. NGINX and other reverse proxies would see these idle connections and close them, causing issues for all of our clients. hRPC&#39;s use of WebSockets solves this problem, as reverse proxies are fully capable of understanding them.</p>

<p>In general, with hRPC we retain the bulk of gRPC&#39;s advantages while simplifying things massively.</p>

<h2 id="why-not-plain-rest">Why not plain REST?</h2>

<p>Protobuf provides a more compact binary format for requests than JSON. It lets the user define a schema for their messages and RPCs, which results in easy server and client code generation. Protobuf also has features that are very useful for this kind of schema (such as extensions), and as such is a nice fit for hRPC.</p>
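<p>To make the size claim concrete, here is a sketch that hand-encodes a message with a single <code>string content = 1;</code> field in the Protobuf wire format and compares it with the equivalent JSON. Real code would use a library like <code>prost</code>; this just shows the framing:</p>

<pre><code class="language-rust">// Protobuf wire format: a string field is encoded as a tag byte
// (field number &lt;&lt; 3 | wire type 2 for length-delimited), a varint
// length, then the raw bytes. For short strings both the tag and
// the length fit in one byte each.
fn encode_message(content: &amp;str) -&gt; Vec&lt;u8&gt; {
    assert!(content.len() &lt; 128); // keep the length varint to one byte
    let mut buf = Vec::new();
    buf.push((1 &lt;&lt; 3) | 2);           // tag: field 1, wire type 2
    buf.push(content.len() as u8);     // length
    buf.extend_from_slice(content.as_bytes());
    buf
}

fn main() {
    let proto = encode_message(&#34;hi&#34;);
    let json = r#&#34;{&#34;content&#34;:&#34;hi&#34;}&#34;#;
    // 4 bytes of Protobuf vs 16 bytes of JSON for the same payload
    println!(&#34;protobuf: {} bytes, JSON: {} bytes&#34;, proto.len(), json.len());
}
</code></pre>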

<h2 id="a-simple-chat-example">A Simple Chat Example</h2>

<p>Let&#39;s try out hRPC with a basic chat example. This is a simple system that supports posting chat messages which are then streamed back to all clients. Here is the protocol:</p>

<pre><code class="language-protobuf">syntax = &#34;proto3&#34;;

package chat;

// Empty object which is used in place of nothing
message Empty { }

// Object that represents a chat message
message Message { string content = 1; }

service Chat {
  // Endpoint to send a chat message
  rpc SendMessage(Message) returns (Empty);
  // Endpoint to stream chat messages
  rpc StreamMessages(Empty) returns (stream Message);
}
</code></pre>

<p>By the end, this is what we will have:</p>

<p><img src="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/iasl6fvuswwod99f09nu.gif" alt="vue client being demonstrated"></p>

<h3 id="getting-started">Getting Started</h3>

<p><strong>NOTE</strong>: If you don&#39;t want to follow along, you can find the full server example at <a href="https://github.com/harmony-development/hrpc-examples/tree/main/chat" rel="nofollow">hRPC examples repository</a>.</p>

<p>Let&#39;s start by writing a server that implements this. We will use <a href="https://github.com/harmony-development/hrpc-rs" rel="nofollow">hrpc-rs</a>, which is a Rust implementation of hRPC.</p>

<p><strong>Note:</strong> If you don&#39;t have Rust installed, you can install it from <a href="https://rustup.rs/" rel="nofollow">the rustup website</a>.</p>

<p>We get started with creating our project with <code>cargo new chat-example --bin</code>.</p>

<p>Now we will need to add a few dependencies to <code>Cargo.toml</code>:</p>

<pre><code class="language-toml">[build-dependencies]
# `hrpc-build` will handle generating Protobuf code for us
# The features we enable here matches the ones we enable for `hrpc`
hrpc-build = { version = &#34;0.29&#34;, features = [&#34;server&#34;, &#34;recommended&#34;] }

[dependencies]
# `prost` provides us with protobuf decoding and encoding
prost = &#34;0.9&#34;
# `hrpc` is the `hrpc-rs` main crate!
# Enable hrpc&#39;s server features, and the recommended transport
hrpc = { version = &#34;0.29&#34;, features = [&#34;server&#34;, &#34;recommended&#34;] }
# `tokio` is the async runtime we use
# Enable tokio&#39;s macros so we can mark our main function, and enable multi
# threaded runtime
tokio = { version = &#34;1&#34;, features = [&#34;rt&#34;, &#34;rt-multi-thread&#34;, &#34;macros&#34;] }
# `tower-http` is a collection of HTTP related middleware
tower-http = { version = &#34;0.1&#34;, features = [&#34;cors&#34;] }
# Logging utilities
# `tracing` gives us the ability to log from anywhere we want
tracing = &#34;0.1&#34;
# `tracing-subscriber` gives us a terminal logger
tracing-subscriber = &#34;0.3&#34;
</code></pre>

<p>Don&#39;t forget to check if your project compiles with <code>cargo check</code>!</p>

<h3 id="building-the-protobufs">Building the Protobufs</h3>

<p>Now, let&#39;s get basic protobuf code generation working.</p>

<p>First, go ahead and copy the chat protocol from earlier into <code>src/chat.proto</code>.</p>

<p>After that we will need a build script. Make a file called <code>build.rs</code> in the root of the project:</p>

<pre><code class="language-rust">// build.rs
fn main() {
    // The path here is the path to our protocol file
    // which we copied in the previous step!
    //
    // This will generate Rust code for our protobuf definitions.
    hrpc_build::compile_protos(&#34;src/chat.proto&#34;)
        .expect(&#34;could not compile the proto&#34;);
}
</code></pre>

<p>And lastly, we need to import the generated code:</p>

<pre><code class="language-rust">// src/main.rs
// Our chat package generated code
pub mod chat {
    // This imports all the generated code for you
    hrpc::include_proto!(&#34;chat&#34;);
}

// This is empty for now!
fn main() { }
</code></pre>

<p>Now you can run <code>cargo check</code> to see if it compiles!</p>

<h3 id="implementing-the-protocol">Implementing the Protocol</h3>

<p>In this section, we will implement the protocol endpoints.</p>

<p>First, get started by importing the stuff we will need:</p>

<pre><code class="language-rust">// src/main.rs
// top of the file

// Import everything from chat package, and the generated
// server trait
use chat::{*, chat_server::*};
// Import the server prelude, which contains
// often used code that is used to develop servers.
use hrpc::server::prelude::*;
</code></pre>

<p>Now, let&#39;s define the business logic for the Chat server. This is a simple example, so we can just use channels from <code>tokio::sync::broadcast</code>. This will allow us to broadcast our chat messages to all clients connected.</p>

<pre><code class="language-rust">// ... other `use` statements

// The channel we will use to broadcast our chat messages
use tokio::sync::broadcast;
</code></pre>
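<p>If you haven&#39;t used broadcast channels before, the semantics can be sketched with only the standard library: every subscriber gets its own receiver, and each sent message is delivered to all of them. (This toy version is synchronous and unbounded, unlike <code>tokio::sync::broadcast</code>, but the shape is the same.)</p>

<pre><code class="language-rust">use std::sync::mpsc;

// A minimal stand-in for a broadcast channel: we keep one mpsc
// sender per subscriber and clone every message to each of them.
struct Broadcaster&lt;T: Clone&gt; {
    subscribers: Vec&lt;mpsc::Sender&lt;T&gt;&gt;,
}

impl&lt;T: Clone&gt; Broadcaster&lt;T&gt; {
    fn new() -&gt; Self {
        Self { subscribers: Vec::new() }
    }

    // Like `broadcast::Sender::subscribe`: hand out a fresh receiver.
    fn subscribe(&amp;mut self) -&gt; mpsc::Receiver&lt;T&gt; {
        let (tx, rx) = mpsc::channel();
        self.subscribers.push(tx);
        rx
    }

    // Deliver a message to every subscriber, ignoring ones that
    // have disconnected.
    fn send(&amp;self, msg: T) {
        for sub in &amp;self.subscribers {
            let _ = sub.send(msg.clone());
        }
    }
}

fn main() {
    let mut b = Broadcaster::new();
    let rx1 = b.subscribe();
    let rx2 = b.subscribe();
    b.send(String::from(&#34;hi&#34;));
    // Both subscribers see the same message.
    println!(&#34;{} / {}&#34;, rx1.recv().unwrap(), rx2.recv().unwrap());
}
</code></pre>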

<p>Afterwards we can define our service state:</p>

<pre><code class="language-rust">pub struct ChatService {
    // The sender half of our broadcast channel.
    // 
    // We will use its `.subscribe()` method to get a
    // receiver when a client connects.
    message_broadcast: broadcast::Sender&lt;Message&gt;,
}
</code></pre>

<p>Then we define a simple constructor:</p>

<pre><code class="language-rust">impl ChatService {
    // Creates a new `ChatService`
    fn new() -&gt; Self {
        // Create a broadcast channel that can buffer at most
        // 100 pending items. This doesn&#39;t matter much in our
        // case, so the number is arbitrary.
        let (tx, _) = broadcast::channel(100);
        Self {
            message_broadcast: tx,
        }
    }
}
</code></pre>

<p>Now we need to implement the generated trait for our service:</p>

<pre><code class="language-rust">impl Chat for ChatService {
    // This corresponds to the SendMessage endpoint
    // 
    // `handler` is a Rust macro that is used to transform
    // an `async fn` into a properly typed hRPC trait method.
    #[handler]
    async fn send_message(&amp;self, request: Request&lt;Message&gt;) -&gt; ServerResult&lt;Response&lt;Empty&gt;&gt; {
        // we will add this in a bit
    }
    
    // This corresponds to the StreamMessages endpoint
    #[handler]
    async fn stream_messages(
        &amp;self,
        // We don&#39;t use the request here, so we can just ignore it.
        // The leading `_` stops Rust from complaining about unused
        // variables!
        _request: Request&lt;()&gt;,
        socket: Socket&lt;Message, Empty&gt;,
    ) -&gt; ServerResult&lt;()&gt; {
        // we will add this in a bit
    }
}
</code></pre>

<p>And now for the actual logic, let&#39;s start with message sending:</p>

<pre><code class="language-rust">#[handler]
async fn send_message(&amp;self, request: Request&lt;Message&gt;) -&gt; ServerResult&lt;Response&lt;Empty&gt;&gt; {
    // Extract the chat message from the request
    let message = request.into_message().await?;

    // Log the message we just got
    tracing::info!(&#34;got message: {}&#34;, message.content);

    // Try to broadcast the chat message across the channel,
    // and if it fails return an error. Note that we log before
    // broadcasting, since `send` takes ownership of the message.
    if self.message_broadcast.send(message).is_err() {
        return Err(HrpcError::new_internal_server_error(&#34;couldn&#39;t broadcast message&#34;));
    }

    Ok((Empty {}).into_response())
}
</code></pre>

<p>Streaming logic is simple. Simply subscribe to the broadcast channel, and then read messages from that channel forever until there&#39;s an error:</p>

<pre><code class="language-rust">#[handler]
async fn stream_messages(
    &amp;self,
    _request: Request&lt;()&gt;,
    socket: Socket&lt;Message, Empty&gt;,
) -&gt; ServerResult&lt;()&gt; {
    // Subscribe to the message broadcaster
    let mut message_receiver = self.message_broadcast.subscribe();

    // Poll for received messages...
    while let Ok(message) = message_receiver.recv().await {
        // ...and send them to the client.
        socket.send_message(message).await?;
    }

    Ok(())
}
</code></pre>

<p>Let&#39;s put all of this together in the <code>main</code> function. We&#39;ll make a new chat server, where we pass in our implementation of the service. We&#39;ll be serving using the Hyper HTTP transport for the server, although this can be swapped out with another transport if needed.</p>

<pre><code class="language-rust">// ...other imports

// Import our CORS middleware
use tower_http::cors::CorsLayer;

// Import the Hyper HTTP transport for hRPC
use hrpc::server::transport::http::Hyper;

// `tokio::main` is a Rust macro that converts an `async fn`
// `main` function into a synchronous `main` function, and enables
// you to use the `tokio` async runtime. The runtime we use is the
// multithreaded runtime, which is what we want.
#[tokio::main]
async fn main() -&gt; Result&lt;(), BoxError&gt; {
    // Initialize the default logging in `tracing-subscriber`
    // which is logging to the terminal
    tracing_subscriber::fmt().init();
    
    // Create our chat service
    let service = ChatServer::new(ChatService::new());

    // Create our transport that we will use to serve our service
    let transport = Hyper::new(&#34;127.0.0.1:2289&#34;)?;

    // Layer our transport for use with CORS.
    // Since this is specific to HTTP, we use the transport&#39;s layer method.
    //
    // Note: A &#34;layer&#34; can simply be thought of as a middleware!
    let transport = transport.layer(CorsLayer::permissive());

    // Serve our service with our transport
    transport.serve(service).await?;

    Ok(())
}
</code></pre>

<p>Notice how in the code above, we needed to specify a CORS layer. The next step of the process, of course, is to write a frontend for this.</p>

<h3 id="frontend-cli">Frontend (CLI)</h3>

<p>If you don&#39;t want to use the web client example, you can try the CLI client in the <a href="https://github.com/harmony-development/hrpc-examples/tree/main/chat" rel="nofollow">hRPC examples repository</a>. Keep in mind that this post doesn&#39;t cover writing a CLI client.</p>

<p>To run it, after you <code>git clone</code> the repository linked, navigate to <code>chat/tui-client</code> and run <code>cargo run</code>. Instructions are also available in the READMEs in the repository.</p>

<h3 id="frontend-vue-3-vite-ts">Frontend (Vue 3 + Vite + TS)</h3>

<p><strong>NOTE</strong>: If you don&#39;t want to follow along, you can find the full web client example at <a href="https://github.com/harmony-development/hrpc-examples/tree/main/chat" rel="nofollow">hRPC examples repository</a>.</p>

<p>The setup is a basic Vite project using the Vue template, with all of the boilerplate demo code removed. Once you have created the project, install the following packages:</p>

<p><code>npm i @protobuf-ts/runtime @protobuf-ts/runtime-rpc @harmony-dev/transport-hrpc</code></p>

<p><code>npm i -D @protobuf-ts/plugin @protobuf-ts/protoc windicss vite-plugin-windicss</code></p>

<p>In order to get Protobuf generation working, we&#39;ll use <a href="https://buf.build" rel="nofollow">Buf</a>, a tool specifically built for building protocol buffers. Start by making the following <code>buf.gen.yaml</code>:</p>

<pre><code class="language-yaml">version: v1
plugins:
  - name: ts
    out: gen
    opt: generate_dependencies,long_type_string
    path: ./node_modules/@protobuf-ts/plugin/bin/protoc-gen-ts
</code></pre>

<p>The config above invokes the code generator we installed, enables a string representation for longs, and also generates code for the builtin Google types.</p>

<p>Now, paste the protocol from earlier into <code>protocol/chat.proto</code> in the root of the folder, and run <code>buf generate ./protocol</code>. If you see a <code>gen</code> folder appear, then the code generation worked! ✅</p>

<h4 id="the-implementation">The Implementation</h4>

<p>When building the UI, it&#39;s useful to have a live preview of our site. Run <code>npm run dev</code> in a terminal to start a dev server.</p>

<p>The entire implementation will be done in <code>src/App.vue</code>, the main Vue component for the site.</p>

<p>For the business logic, we&#39;ll be using the new fancy and shiny Vue 3 <a href="https://v3.vuejs.org/api/sfc-script-setup.html" rel="nofollow">script setup syntax</a>. Start by defining it:</p>

<pre><code class="language-html">&lt;script setup lang=&#34;ts&#34;&gt;
&lt;/script&gt;
</code></pre>

<p>Now, inside this block, we first create a chat client by passing our client configuration into the HrpcTransport constructor:</p>

<pre><code class="language-typescript">import { ChatClient } from &#34;../gen/chat.client&#34;;
import { HrpcTransport } from &#34;@harmony-dev/transport-hrpc&#34;;

const client = new ChatClient(
  new HrpcTransport({
    baseUrl: &#34;http://127.0.0.1:2289&#34;,
    insecure: true
  })
);
</code></pre>

<p>Next, we will <strong>define a reactive list of messages, and content of the text input</strong>:</p>

<pre><code class="language-typescript">const content = ref(&#34;&#34;);
const msgs = reactive&lt;string[]&gt;([]);
</code></pre>

<p>These reactive values are bound in the UI; mutating them is ultimately how we make the page reflect a change.</p>

<p>Now let&#39;s add our API logic:</p>

<pre><code class="language-typescript">// when the component mounts (page loads)
onMounted(() =&gt; {
  // start streaming messages
  client.streamMessages({}).responses.onMessage((msg) =&gt; {
    // add the message to the list
    msgs.push(msg.content);
  });
});

// keyboard handler for the input
const onKey = (ev: KeyboardEvent) =&gt; {
  if (ev.key !== &#34;Enter&#34;) return; // only send a message on enter
  client.sendMessage({
    content: content.value,
  }); // send a message to the server
  content.value = &#34;&#34;; // clear the textbox after sending
};
</code></pre>

<p>Now let&#39;s add some layout and styling, with event handlers registered for the input and a <code>v-for</code> loop to display the messages:</p>

<pre><code class="language-html">&lt;template&gt;
  &lt;div class=&#34;h-100vh w-100vw bg-surface-900 flex flex-col justify-center p-3&#34;&gt;
    &lt;div class=&#34;flex-1 p-3 flex flex-col gap-2 overflow-auto&#34;&gt;
      &lt;p class=&#34;p-3 max-w-30ch rounded-md bg-surface-800&#34; v-for=&#34;m in msgs&#34; :key=&#34;m&#34;&gt;{{ m }}&lt;/p&gt;
    &lt;/div&gt;
    &lt;input
      class=&#34;
        p-2
        bg-surface-700
        rounded-md
        focus:outline-none focus:ring-3
        ring-secondary-400
        mt-2
      &#34;
      v-model=&#34;content&#34;
      @keydown=&#34;onKey&#34;
    /&gt;
  &lt;/div&gt;
&lt;/template&gt;
</code></pre>

<p>If you are unsure what these classes mean, take a look at <a href="https://windicss.org" rel="nofollow">WindiCSS</a> to learn more.</p>

<p>And with that, our chat application is complete!</p>

<h2 id="other-implementations">Other Implementations</h2>

<p>While we used Rust for the server and TypeScript for the client here, hRPC is cross-language. The <a href="https://github.com/harmony-development" rel="nofollow">harmony-development organisation on GitHub</a> has other implementations, most located in the <a href="https://github.com/harmony-development/hrpc" rel="nofollow">hRPC repo</a>.</p>

<p>Tags: <a href="https://blog.blackquill.cc/tag:libre" class="hashtag" rel="nofollow"><span>#</span><span class="p-category">libre</span></a> <a href="https://blog.blackquill.cc/tag:harmony" class="hashtag" rel="nofollow"><span>#</span><span class="p-category">harmony</span></a></p>
]]></content:encoded>
      <guid>https://blog.blackquill.cc/introducing-hrpc-a-simple-rpc-system-for-user-facing-apis</guid>
      <pubDate>Mon, 15 Nov 2021 01:14:22 +0000</pubDate>
    </item>
    <item>
      <title>QML-LSP: A Simple LSP Server For QML</title>
      <link>https://blog.blackquill.cc/qml-lsp-a-simple-lsp-server-for-qml</link>
      <description>&lt;![CDATA[kate showing a completion for qml&#xA;&#xA;kate showing another completion for qml&#xA;&#xA;qml-lsp is a simple LSP server for QML. it currently only offers primitive non-context-aware completions, without anything else. it understands:&#xA;&#xA;completing component names&#xA;completing properties defined in surrounding component (not properties from its superclasses)&#xA;completing enums&#xA;&#xA;note that it doesn&#39;t understand project-local QML components, only stuff installed to the system QML dirs with qmlplugindump set up correctly.&#xA;&#xA;regardless, it does a decent job at completing system types, and considering that Qt Creator struggles with some of the plugins that qml-lsp has no problem with, it&#39;s pretty usable and an improvement over the nothing found in editors other than Qt Creator.&#xA;&#xA;you can check out the source at https://invent.kde.org/cblack/qew-em-el-el-ess-pee, or fetch a statically linked binary here. plonk the binary in your PATH under the name qml-lsp.&#xA;&#xA;kate LSP config:&#xA;&#x9;&#34;qml&#34;: {&#xA;&#x9;&#x9;&#34;command&#34;: [&#34;qml-lsp&#34;],&#xA;&#x9;&#x9;&#34;highlightingModeRegex&#34;: &#34;^QML$&#34;&#xA;&#x9;}&#xA;&#xA;tags: #libre]]&gt;</description>
      <content:encoded><![CDATA[<p><img src="https://img.blackquill.cc/the%20alignery.png" alt="kate showing a completion for qml"></p>

<p><img src="https://img.blackquill.cc/cope%20seethery.png" alt="kate showing another completion for qml"></p>

<p>qml-lsp is a simple LSP server for QML. it currently only offers primitive non-context-aware completions, without anything else. it understands:</p>
<ul><li>completing component names</li>
<li>completing properties defined in surrounding component (not properties from its superclasses)</li>
<li>completing enums</li></ul>

<p>note that it doesn&#39;t understand project-local QML components, only stuff installed to the system QML dirs with qmlplugindump set up correctly.</p>

<p>regardless, it does a decent job at completing system types, and considering that Qt Creator struggles with some of the plugins that qml-lsp has no problem with, it&#39;s pretty usable and an improvement over the nothing found in editors other than Qt Creator.</p>

<p>you can check out the source at <a href="https://invent.kde.org/cblack/qew-em-el-el-ess-pee" rel="nofollow">https://invent.kde.org/cblack/qew-em-el-el-ess-pee</a>, or fetch a statically linked binary <a href="https://invent.kde.org/cblack/qew-em-el-el-ess-pee/uploads/1c85ba8870a4aa24a7ccd6d17939bce9/qml-lsp-static" rel="nofollow">here</a>. plonk the binary in your PATH under the name <code>qml-lsp</code>.</p>

<p>kate LSP config:</p>

<pre><code class="language-json">	&#34;qml&#34;: {
		&#34;command&#34;: [&#34;qml-lsp&#34;],
		&#34;highlightingModeRegex&#34;: &#34;^QML$&#34;
	}
</code></pre>

<p>tags: <a href="https://blog.blackquill.cc/tag:libre" class="hashtag" rel="nofollow"><span>#</span><span class="p-category">libre</span></a></p>
]]></content:encoded>
      <guid>https://blog.blackquill.cc/qml-lsp-a-simple-lsp-server-for-qml</guid>
      <pubDate>Thu, 04 Nov 2021 03:47:21 +0000</pubDate>
    </item>
    <item>
      <title>A Hoppy Week In Tok</title>
      <link>https://blog.blackquill.cc/a-hoppy-week-in-tok</link>
      <description>&lt;![CDATA[the messages view showing a file thumbnail&#xA;&#xA;Tok has seen a handful of improvements this week, one being that file messages now show a thumbnail if available.&#xA;&#xA;Music View&#xA;&#xA;music view&#xA;&#xA;Tok can now show you all songs that have been uploaded to a chat.&#xA;&#xA;Hop To Message&#xA;&#xA;Tok now allows you to hop to a message by tapping on it, no matter how far back in history it is.&#xA;&#xA;Bugfixes:&#xA;&#xA;messages with newlines can be sent&#xA;formatting works again&#xA;nix files updated&#xA;the message text field no longer keeps the last formatting applied when you send a message&#xA;&#xA;Obtaining Tok&#xA;&#xA;Tok can be built from source from https://invent.kde.org/network/tok.&#xA;&#xA;There&#39;s a Telegram room for Tok available at https://t.me/kdetok, where you can come on and chat about anything Tok related, such as asking questions on using or building Tok.&#xA;&#xA;Contributing&#xA;&#xA;Interested in contributing? Come on by the dev chat and say hello!&#xA;&#xA;Tags: #libre #tok]]&gt;</description>
      <content:encoded><![CDATA[<p><img src="https://img.blackquill.cc/screenshot%20memery.png" alt="the messages view showing a file thumbnail"></p>

<p>Tok has seen a handful of improvements this week, one being that file messages now show a thumbnail if available.</p>

<h2 id="music-view">Music View</h2>

<p><img src="https://img.blackquill.cc/music%20view%20in%20tok.png" alt="music view"></p>

<p>Tok can now show you all songs that have been uploaded to a chat.</p>

<h2 id="hop-to-message">Hop To Message</h2>

<p>Tok now allows you to hop to a message by tapping on it, no matter how far back in history it is.</p>

<h2 id="bugfixes">Bugfixes</h2>
<ul><li>messages with newlines can be sent</li>
<li>formatting works again</li>
<li>nix files updated</li>
<li>the message text field no longer keeps the last formatting applied when you send a message</li></ul>

<h2 id="obtaining-tok">Obtaining Tok</h2>

<p>Tok can be built from source from <a href="https://invent.kde.org/network/tok" rel="nofollow">https://invent.kde.org/network/tok</a>.</p>

<p>There&#39;s a Telegram room for Tok available at <a href="https://t.me/kdetok" rel="nofollow">https://t.me/kdetok</a>, where you can come on and chat about anything Tok related, such as asking questions on using or building Tok.</p>

<h2 id="contributing">Contributing</h2>

<p>Interested in contributing? Come on by the dev chat and say hello!</p>

<p>Tags: <a href="https://blog.blackquill.cc/tag:libre" class="hashtag" rel="nofollow"><span>#</span><span class="p-category">libre</span></a> <a href="https://blog.blackquill.cc/tag:tok" class="hashtag" rel="nofollow"><span>#</span><span class="p-category">tok</span></a></p>
]]></content:encoded>
      <guid>https://blog.blackquill.cc/a-hoppy-week-in-tok</guid>
      <pubDate>Sat, 28 Aug 2021 16:22:43 +0000</pubDate>
    </item>
    <item>
      <title>Way More Than A Week In Tok</title>
      <link>https://blog.blackquill.cc/way-more-than-a-week-in-tok</link>
      <description>&lt;![CDATA[syntax highlighting being demonstrated in tok&#xA;&#xA;Tok has had many changes since the last time I made one of these blog posts, the biggest one being that code blocks are syntax highlighted!&#xA;&#xA;Tok uses KSyntaxHighlighting, the same syntax highlighting engine that powers Kate, KWrite, and other KDE applications that feature syntax highlighting.&#xA;&#xA;Additionally, messages containing codeblocks are able to grow horizontally in width beyond the usual message size, letting you read horizontally wide code easier.&#xA;&#xA;Emoji Completion&#xA;&#xA;emoji completion&#xA;&#xA;Tok now displays autocompletion for :emojis:, making the process of typing in emojis much more seamless.&#xA;&#xA;Edited Indicator&#xA;&#xA;edited indicator&#xA;&#xA;Tok now indicates when a message has been edited by the sender.&#xA;&#xA;Jump To Start Buttons&#xA;&#xA;jump to start&#xA;&#xA;Tok now has buttons that allow you to hop back to the start of various views, such as the chats list and the messages view.&#xA;&#xA;Improved In-Window Menubar&#xA;&#xA;better in-window menubar, featuring visual changes&#xA;&#xA;Tok&#39;s in-window menubar now has various improvements, such as using the colour of the rest of the header area, as well as the right sidebar respecting the menubar&#39;s appearance.&#xA;&#xA;Proxies&#xA;&#xA;proxy support&#xA;&#xA;Tok now supports configuring proxies, allowing you to access Telegram in countries that don&#39;t want you to access Telegram. 
&#xA;&#xA;And in true anticonvergent fashion, Tok has a dedicated mobile UI for proxies instead of simply using the desktop UI on mobile or the mobile UI on desktop.&#xA;&#xA;proxies on mobile&#xA;&#xA;Better Notifications&#xA;&#xA;better notifications&#xA;&#xA;Tok notifications now display more information about the message, support more message types, and display the profile picture of the chat you&#39;re receiving the notification from.&#xA;&#xA;Link Mentions&#xA;&#xA;mentions&#xA;&#xA;Tok now renders mentions as links that display the user&#39;s profile when clicked.&#xA;&#xA;Minor UI Improvements&#xA;&#xA;Chat list unread indicators now become pill-shaped whenever they grow horizontally.&#xA;&#xA;The typing indicator in the chat list is now accent-coloured, like the typing indicator in the header.&#xA;&#xA;Translation Fixes&#xA;&#xA;Tok now correctly loads translation files, allowing it to render in non-English languages.&#xA;&#xA;Optimisations&#xA;&#xA;Tok&#39;s startup time has been optimised by a few hundred milliseconds.&#xA;&#xA;Bugfixes&#xA;&#xA;Tok no longer resets the scroll position of the chats list whenever chats are moved.&#xA;&#xA;Alt-Up, Alt-Down, and Ctrl-K keyboard shortcuts work again.&#xA;&#xA;Formatting is generally less buggier, and message formats better match how they&#39;re supposed to look.&#xA;&#xA;Obtaining Tok&#xA;&#xA;Tok can be built from source from https://invent.kde.org/network/tok.&#xA;&#xA;There&#39;s a Telegram room for Tok available at https://t.me/kdetok, where you can come on and chat about anything Tok related, such as asking questions on using or building Tok.&#xA;&#xA;Contributing&#xA;&#xA;Interested in contributing? Come on by the dev chat and say hello!&#xA;&#xA;Tags: #libre&#xA;]]&gt;</description>
      <content:encoded><![CDATA[<p><img src="https://img.blackquill.cc/vollbilder.png" alt="syntax highlighting being demonstrated in tok"></p>

<p>Tok has had many changes since the last time I made one of these blog posts, the biggest one being that code blocks are syntax highlighted!</p>

<p>Tok uses KSyntaxHighlighting, the same syntax highlighting engine that powers Kate, KWrite, and other KDE applications that feature syntax highlighting.</p>

<p>Additionally, messages containing codeblocks can grow horizontally beyond the usual message width, letting you read wide code more easily.</p>

<h2 id="emoji-completion">Emoji Completion</h2>

<p><img src="https://img.blackquill.cc/emoji%20completion.png" alt="emoji completion"></p>

<p>Tok now displays autocompletion for :emojis:, making the process of typing in emojis much more seamless.</p>

<h2 id="edited-indicator">Edited Indicator</h2>

<p><img src="https://img.blackquill.cc/edited%20tag.png" alt="edited indicator"></p>

<p>Tok now indicates when a message has been edited by the sender.</p>

<h2 id="jump-to-start-buttons">Jump To Start Buttons</h2>

<p><img src="https://img.blackquill.cc/jump%20to%20start.png" alt="jump to start"></p>

<p>Tok now has buttons that allow you to hop back to the start of various views, such as the chats list and the messages view.</p>

<h2 id="improved-in-window-menubar">Improved In-Window Menubar</h2>

<p><img src="https://img.blackquill.cc/menubar%20improvements.png" alt="better in-window menubar, featuring visual changes"></p>

<p>Tok&#39;s in-window menubar now has various improvements, such as using the colour of the rest of the header area, as well as the right sidebar respecting the menubar&#39;s appearance.</p>

<h2 id="proxies">Proxies</h2>

<p><img src="https://img.blackquill.cc/proxies%20desktop.png" alt="proxy support"></p>

<p>Tok now supports configuring proxies, allowing you to access Telegram in countries that don&#39;t want you to access Telegram.</p>

<p>And in true anticonvergent fashion, Tok has a dedicated mobile UI for proxies instead of simply using the desktop UI on mobile or the mobile UI on desktop.</p>

<p><img src="https://img.blackquill.cc/design%20for%20the%20platform.png" alt="proxies on mobile"></p>

<h2 id="better-notifications">Better Notifications</h2>

<p><img src="https://img.blackquill.cc/better%20notificationen.png" alt="better notifications"></p>

<p>Tok notifications now display more information about the message, support more message types, and display the profile picture of the chat you&#39;re receiving the notification from.</p>

<h2 id="link-mentions">Link Mentions</h2>

<p><img src="https://img.blackquill.cc/sakon%20a.png" alt="mentions"></p>

<p>Tok now renders mentions as links that display the user&#39;s profile when clicked.</p>

<h2 id="minor-ui-improvements">Minor UI Improvements</h2>

<p>Chat list unread indicators now become pill-shaped whenever they grow horizontally.</p>

<p>The typing indicator in the chat list is now accent-coloured, like the typing indicator in the header.</p>

<h2 id="translation-fixes">Translation Fixes</h2>

<p>Tok now correctly loads translation files, allowing it to render in non-English languages.</p>

<h2 id="optimisations">Optimisations</h2>

<p>Tok&#39;s startup time has been optimised by a few hundred milliseconds.</p>

<h2 id="bugfixes">Bugfixes</h2>

<p>Tok no longer resets the scroll position of the chats list whenever chats are moved.</p>

<p>Alt-Up, Alt-Down, and Ctrl-K keyboard shortcuts work again.</p>

<p>Formatting is generally less buggy, and messages better match how they&#39;re supposed to look.</p>

<h2 id="obtaining-tok">Obtaining Tok</h2>

<p>Tok can be built from source from <a href="https://invent.kde.org/network/tok" rel="nofollow">https://invent.kde.org/network/tok</a>.</p>

<p>There&#39;s a Telegram room for Tok available at <a href="https://t.me/kdetok" rel="nofollow">https://t.me/kdetok</a>, where you can come on and chat about anything Tok related, such as asking questions on using or building Tok.</p>

<h2 id="contributing">Contributing</h2>

<p>Interested in contributing? Come on by the dev chat and say hello!</p>

<p>Tags: <a href="https://blog.blackquill.cc/tag:libre" class="hashtag" rel="nofollow"><span>#</span><span class="p-category">libre</span></a></p>
]]></content:encoded>
      <guid>https://blog.blackquill.cc/way-more-than-a-week-in-tok</guid>
      <pubDate>Sat, 21 Aug 2021 02:03:08 +0000</pubDate>
    </item>
    <item>
      <title>Asynchronous QtQuick UIs and their implementation: The Toolbox</title>
      <link>https://blog.blackquill.cc/asynchronous-qtquick-uis-and-their-implementation-the-toolbox</link>
      <description>&lt;![CDATA[This blog post goes over the set of tools you have to work with when doing QtQuick UIs that need to perform something asynchronously.&#xA;&#xA;Data Reactivity&#xA;&#xA;Probably the oldest tool in the toolbox, as it&#39;s foundational to QML&#39;s construction, and one you should already know how to use.&#xA;&#xA;BooksModel {&#xA;    id: booksModel&#xA;}&#xA;Button {&#xA;    text: The model has ${booksModel.count} items&#xA;}&#xA;Button {&#xA;   onClicked: booksModel.refresh()&#xA;}&#xA;&#xA;QML is reactive, meaning that when you change a property, dependent bindings update automatically.&#xA;&#xA;When applied to asynchrony, this means that asynchronous events or requests can simply mutate an object&#39;s properties on completion, and the UI will change when it does.&#xA;&#xA;This is the most common form of asynchrony in a QtQuick UI, and it should be your first goto when you need to perform something asynchronously and display results in the UI, as it plays nicely with a declarative UI.&#xA;&#xA;This is generally the right pattern to use when you can describe the asynchronous process as &#34;the UI displays the current state of a model/object, which changes in response to events.&#34;&#xA;&#xA;Thennables&#xA;&#xA;The newest tool in the toolbox (at least in QML), Thennables play nicely with other cool and shiny things, such as C++20 Coroutines.&#xA;&#xA;Button {&#xA;    text: &#34;Do it&#34;&#xA;    onClicked: getNewButtonTextFromTheInternet().then((it) =  this.text = it)&#xA;}&#xA;&#xA;Implementations of the Thennable pattern can be found in Qt Automotive, as the QIviPendingReply family of classes, and in Croutons, as the Future family of classes. These allow you to make an object in C++ that you can return as a Thennable to QML after scheduling a task that&#39;ll fufill or reject the thennable from the C++ side. 
On the backend, it usually looks like this:&#xA;&#xA;QIviPendingReplybool foo;&#xA;&#xA;// runs asynchronously&#xA;client-  getStatus(foo mutable {&#xA;    foo.setSuccess(response);&#xA;});&#xA;&#xA;return foo;&#xA;&#xA;Or, if you&#39;re using Croutons + C++20 Coroutines...&#xA;auto response = coawait client-  getStatus();&#xA;&#xA;coreturn response.isOnline();&#xA;&#xA;This is generally the right pattern to use when you can describe the flow of asynchronous actions as an &#34;and then&#34; chain. For example, &#34;make a request and then display a notification to the user if it succeeds or fails&#34;. If the asynchronous process can take advantage of data reactivity, this is probably not the correct pattern to use.&#xA;&#xA;Task Objects&#xA;&#xA;This is rarely the right pattern, and people that have to use your API will probably hate you if you ship this in a library. Despite that, it&#39;s a (thankfully uncommon) thing in the wild, so I&#39;ll document it here.&#xA;&#xA;FileDownload {&#xA;    id: fileDownloader&#xA;&#xA;    url: &#34;http://placehold.it/350x150&#34;&#xA;&#xA;    onStarted: status.text = &#34;Starting download...&#34;&#xA;    onError: status.text = &#34;Download failed&#34;&#xA;    onProgressChanged: status.text = Download ${progress}% done&#xA;    onFinished: status.text = &#34;Download finished&#34;&#xA;}&#xA;Button {&#xA;&#x9;text: &#34;Download File&#34;&#xA;&#x9;onClicked: fileDownloader.start()&#xA;}&#xA;&#xA;This essentially consists of using a full-fledged object description in QML to describe a task, and to connect to signals on it. 
The object needs to be given an id, and that id needs to be used to invoke a .start() method or setting a running property to true.&#xA;&#xA;Task objects have many disadvantages:&#xA;trying to describe an imperative process directly with a declarative language, which goes as well as you&#39;d expect&#xA;require creating and binding properties for an extra object, which can add up fast, even if the object&#39;s task isn&#39;t actually triggered by a user action. e.g., 50 download buttons with corresponding FileDownload objects, and only a few of those buttons will ever get pressed.&#xA;non-UI matters in an otherwise UI-only language&#xA;&#xA;You can often recognise task objects by distinct &#34;input&#34; properties and &#34;output&#34; signals (less commonly, &#34;output&#34; properties) for a given task.&#xA;&#xA;Instead of using a task object, considering using a Thennable, or better utilising reactive data.&#xA;&#xA;Workers&#xA;&#xA;I&#39;ve honestly never seen the WorkerScript type get used in an actual program, so I&#39;m not sure what their practical application is. Despite that, these are another option for currency, so I&#39;ll describe their behaviour.&#xA;&#xA;A WorkerScript is a wrapper around the JS notion of a WebWorker, which runs in its own thread and receives and sends messages from the main thread. This is the only way to get data in and out of them, so you have to structure it as messages.&#xA;&#xA;Tags: #libre]]&gt;</description>
      <content:encoded><![CDATA[<p>This blog post goes over the set of tools you have to work with when doing QtQuick UIs that need to perform something asynchronously.</p>

<h2 id="data-reactivity">Data Reactivity</h2>

<p>Probably the oldest tool in the toolbox, as it&#39;s foundational to QML&#39;s construction, and one you should already know how to use.</p>

<pre><code class="language-qml">BooksModel {
    id: booksModel
}
Button {
    text: `The model has ${booksModel.count} items`
}
Button {
   onClicked: booksModel.refresh()
}
</code></pre>

<p>QML is reactive, meaning that when you change a property, dependent bindings update automatically.</p>

<p>When applied to asynchrony, this means that asynchronous events or requests can simply mutate an object&#39;s properties on completion, and the UI will change when it does.</p>

<p>This is the most common form of asynchrony in a QtQuick UI, and it should be your first go-to when you need to perform something asynchronously and display the results in the UI, as it plays nicely with a declarative UI.</p>

<p>This is generally the right pattern to use when you can describe the asynchronous process as “the UI displays the current state of a model/object, which changes in response to events.”</p>

<h2 id="thennables">Thennables</h2>

<p>The newest tool in the toolbox (at least in QML), Thennables play nicely with other cool and shiny things, such as C++20 Coroutines.</p>

<pre><code class="language-qml">Button {
    text: &#34;Do it&#34;
    onClicked: getNewButtonTextFromTheInternet().then((it) =&gt; this.text = it)
}
</code></pre>

<p>Implementations of the Thennable pattern can be found in Qt Automotive, as the QIviPendingReply family of classes, and in <a href="https://invent.kde.org/cblack/croutons" rel="nofollow">Croutons</a>, as the Future family of classes. These allow you to make an object in C++ that you can return as a Thennable to QML after scheduling a task that&#39;ll fulfill or reject the Thennable from the C++ side. On the backend, it usually looks like this:</p>

<pre><code class="language-c++">QIviPendingReply&lt;bool&gt; foo;

// runs asynchronously
client-&gt;getStatus([foo](bool isOnline) mutable {
    foo.setSuccess(isOnline);
});

return foo;
</code></pre>

<p>Or, if you&#39;re using Croutons + C++20 Coroutines...</p>

<pre><code class="language-c++">auto response = co_await client-&gt;getStatus();

co_return response.isOnline();
</code></pre>

<p>This is generally the right pattern to use when you can describe the flow of asynchronous actions as an “and then” chain. For example, “make a request <strong>and then</strong> display a notification to the user if it succeeds or fails”. If the asynchronous process can take advantage of data reactivity, this is probably not the correct pattern to use.</p>

<h2 id="task-objects">Task Objects</h2>

<p>This is rarely the right pattern, and people that have to use your API will probably hate you if you ship this in a library. Despite that, it&#39;s a (thankfully uncommon) thing in the wild, so I&#39;ll document it here.</p>

<pre><code class="language-qml">FileDownload {
    id: fileDownloader

    url: &#34;http://placehold.it/350x150&#34;

    onStarted: status.text = &#34;Starting download...&#34;
    onError: status.text = &#34;Download failed&#34;
    onProgressChanged: status.text = `Download ${progress}% done`
    onFinished: status.text = &#34;Download finished&#34;
}
Button {
	text: &#34;Download File&#34;
	onClicked: fileDownloader.start()
}
</code></pre>

<p>This essentially consists of using a full-fledged object description in QML to describe a task, and connecting to signals on it. The object needs to be given an id, which is then used to invoke a <code>.start()</code> method or to set a <code>running</code> property to <code>true</code>.</p>

<p>Task objects have many disadvantages:</p>
<ul><li>they try to describe an imperative process directly with a declarative language, which goes as well as you&#39;d expect</li>
<li>they require creating and binding properties for an extra object, which can add up fast even when the object&#39;s task is never actually triggered by a user action: e.g. 50 download buttons with corresponding FileDownload objects, of which only a few will ever get pressed</li>
<li>they drag non-UI matters into an otherwise UI-only language</li></ul>

<p>You can often recognise task objects by distinct “input” properties and “output” signals (less commonly, “output” properties) for a given task.</p>

<p>Instead of using a task object, consider using a Thennable, or making better use of reactive data.</p>

<h2 id="workers">Workers</h2>

<p>I&#39;ve honestly never seen the WorkerScript type used in an actual program, so I&#39;m not sure what its practical applications are. Despite that, it&#39;s another option for concurrency, so I&#39;ll describe its behaviour.</p>

<p>A WorkerScript is a wrapper around the JS notion of a WebWorker: it runs in its own thread, and receives and sends messages to and from the main thread. Messages are the only way to get data in and out of a worker, so you have to structure your work around them.</p>
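<p>For illustration, here&#39;s a minimal sketch of the pattern (the filename <code>worker.js</code> and the message fields are made up for this example):</p>

<pre><code class="language-qml">import QtQuick 2.15

Item {
    WorkerScript {
        id: worker
        source: &#34;worker.js&#34; // runs on its own thread

        // results can only come back as messages
        onMessage: resultLabel.text = &#34;sum: &#34; + messageObject.result
    }

    Text { id: resultLabel }

    Component.onCompleted: worker.sendMessage({ numbers: [1, 2, 3] })
}
</code></pre>

<p>...and the corresponding <code>worker.js</code>, which does the work off the main thread:</p>

<pre><code class="language-javascript">// worker.js: receive a message, do the work, send the result back
WorkerScript.onMessage = function(message) {
    var sum = message.numbers.reduce(function(a, b) { return a + b; }, 0);
    WorkerScript.sendMessage({ result: sum });
}
</code></pre>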

<p>Tags: <a href="https://blog.blackquill.cc/tag:libre" class="hashtag" rel="nofollow"><span>#</span><span class="p-category">libre</span></a></p>
]]></content:encoded>
      <guid>https://blog.blackquill.cc/asynchronous-qtquick-uis-and-their-implementation-the-toolbox</guid>
      <pubDate>Wed, 04 Aug 2021 06:30:03 +0000</pubDate>
    </item>
    <item>
      <title>This Week In Tok: Days Of Work, Seconds Of Experience</title>
      <link>https://blog.blackquill.cc/this-week-in-tok-days-of-work-seconds-of-experience</link>
      <description>&lt;![CDATA[mobile upload dialogue&#xA;&#xA;Tok now has TWO new completely revamped upload dialogues: one for desktop, and one for mobile. Despite how much of the app is spent using stuff that isn&#39;t the upload dialogues, the upload dialogues took a LOT of time and work to get implemented correctly.&#xA;&#xA;The mobile dialogue makes it convenient to browse through your most recent photos, videos, music, and files to share them with your friends.&#xA;&#xA;video upload dialogue on mobile&#xA;&#xA;music upload dialogue on mobile&#xA;&#xA;files upload dialogue on mobile&#xA;&#xA;On desktop, the upload dialogue now offers a preview of your file, and for images, the option to compress it.&#xA;&#xA;photo upload on desktop&#xA;&#xA;video upload on desktop&#xA;&#xA;music upload on desktop&#xA;&#xA;file upload on desktop&#xA;&#xA;Sending State&#xA;&#xA;send state&#xA;&#xA;The sending state of outgoing messages is now displayed with a little icon by the timestamp.&#xA;&#xA;Adjusted Chat List Look&#xA;&#xA;adjusted chat list look&#xA;&#xA;The look of the chat list has been slightly adjusted, to make it look more like other KDE apps. More information is shown, such as the sending status of outgoing messages and the timestamp of the latest message.&#xA;&#xA;Improved Pasting&#xA;&#xA;Tok now has improved pasting capabilities, able to paste from apps that put images on the clipboard directly like Firefox and Spectacle now.&#xA;&#xA;Obtaining Tok&#xA;&#xA;Tok can be built from source from https://invent.kde.org/network/tok.&#xA;&#xA;There&#39;s a Telegram room for Tok available at https://t.me/kdetok, where you can come on and chat about anything Tok related, such as asking questions on using or building Tok.&#xA;&#xA;Contributing&#xA;&#xA;Interested in contributing? Come on by the dev chat and say hello!&#xA;&#xA;Tags: #libre #tok&#xA;&#xA;]]&gt;</description>
      <content:encoded><![CDATA[<p><img src="https://img.blackquill.cc/tok%20upload%20mobile.png" alt="mobile upload dialogue"></p>

<p>Tok now has TWO new completely revamped upload dialogues: one for desktop, and one for mobile. Despite how little of your time in the app is spent in the upload dialogues, they took a LOT of time and work to implement correctly.</p>

<p>The mobile dialogue makes it convenient to browse through your most recent photos, videos, music, and files to share them with your friends.</p>

<p><img src="https://img.blackquill.cc/tok%20upload%20mobile%20video.png" alt="video upload dialogue on mobile"></p>

<p><img src="https://img.blackquill.cc/tok%20upload%20mobile%20music.png" alt="music upload dialogue on mobile"></p>

<p><img src="https://img.blackquill.cc/tok%20upload%20mobile%20files.png" alt="files upload dialogue on mobile"></p>

<p>On desktop, the upload dialogue now offers a preview of your file, and for images, the option to compress it.</p>

<p><img src="https://img.blackquill.cc/tok%20upload%20dialogue%20desktop.png" alt="photo upload on desktop"></p>

<p><img src="https://img.blackquill.cc/verbos.png" alt="video upload on desktop"></p>

<p><img src="https://img.blackquill.cc/file%20upload%20tok%20desktop.png" alt="music upload on desktop"></p>

<p><img src="https://img.blackquill.cc/file%20upload%20on%20desktop.png" alt="file upload on desktop"></p>

<h2 id="sending-state">Sending State</h2>

<p><img src="https://img.blackquill.cc/send%20state.png" alt="send state"></p>

<p>The sending state of outgoing messages is now displayed with a little icon by the timestamp.</p>

<h2 id="adjusted-chat-list-look">Adjusted Chat List Look</h2>

<p><img src="https://img.blackquill.cc/adjusted%20chat%20list%20look.png" alt="adjusted chat list look"></p>

<p>The look of the chat list has been slightly adjusted, to make it look more like other KDE apps. More information is shown, such as the sending status of outgoing messages and the timestamp of the latest message.</p>

<h2 id="improved-pasting">Improved Pasting</h2>

<p>Tok now has improved pasting capabilities: it can now paste images directly from apps that place them on the clipboard, such as Firefox and Spectacle.</p>

<h2 id="obtaining-tok">Obtaining Tok</h2>

<p>Tok can be built from source from <a href="https://invent.kde.org/network/tok" rel="nofollow">https://invent.kde.org/network/tok</a>.</p>

<p>There&#39;s a Telegram room for Tok available at <a href="https://t.me/kdetok" rel="nofollow">https://t.me/kdetok</a>, where you can come on and chat about anything Tok related, such as asking questions on using or building Tok.</p>

<h2 id="contributing">Contributing</h2>

<p>Interested in contributing? Come on by the dev chat and say hello!</p>

<p>Tags: <a href="https://blog.blackquill.cc/tag:libre" class="hashtag" rel="nofollow"><span>#</span><span class="p-category">libre</span></a> <a href="https://blog.blackquill.cc/tag:tok" class="hashtag" rel="nofollow"><span>#</span><span class="p-category">tok</span></a></p>
]]></content:encoded>
      <guid>https://blog.blackquill.cc/this-week-in-tok-days-of-work-seconds-of-experience</guid>
      <pubDate>Mon, 26 Jul 2021 01:07:48 +0000</pubDate>
    </item>
  </channel>
</rss>