CRC32C, JDK 17 and Netbeans

Even though the source level of /some/source/path is set to: 17, java.util.zip.CRC32C.class cannot be found on the system module path:

This one took me hours.

A while ago I migrated a major project to JDK17 and made many adjustments to ensure further development on it could be done in the latest Netbeans 17. As NB17 is bleeding edge, I regularly check the IDE log to make sure it’s not having a hard time. Unfortunately, doing almost any editing of the project caused warnings such as the message at the top of this post to appear. Hundreds of them. Flooding the log.

This didn’t cause any problems with my ANT build on Jenkins, but I noticed it was making Netbeans pause from time to time, and it worried me that this might also affect dev-time diagnostics and so on. I needed to figure out what was causing the logged message, and if I could stop it.

An online search, unfortunately, showed that this problem is just as perplexing to others, and despite some interesting theories nobody could offer an explanation, solution or work-around.

As my instance of Netbeans 17 is compiled from source by me, I had the source at hand to investigate. Tracking down the message to JavacParser.java was easy, but it revealed (over a thousand lines into the file) that the warning is a consequence of the parser deciding that the source level should be downgraded to 1.8.

No, can’t be having that!

The next few hours led me down a rabbit hole of configuration nonsense. I was adamant that any fix I was going to determine would not require me rewriting Netbeans, as then either I’d have to apply a patch every time I updated/recompiled the source, or I would have to submit the change to the core Netbeans project. As the latter would mean I’d have to consider possible consequences way outside of my limited use cases, this was not an option.

One thing that puzzled me was the fact that the message ended with “the system module path:” but was followed by a blank. No path was actually identified. Looking at the JavacParser source I could see that the moduleBoot variable was empty. This then led me to more time wasted trying to find ways to set that variable via external configuration, with the hope that if I could do that then I could point it at the JDK 17 modules (specifically the jmods/java.base.jmod file where the CRC32C.class file is located). I did not succeed, so I started climbing out of the rabbit hole in the hope that there might be another approach.

Indeed there was another approach. The key test determining the downgrade to 1.8 accompanied by the warning message was of the form:

!hasResource("java/util/zip/CRC32C", new ClassPath[] {moduleBoot}, new ClassPath[] {moduleCompile, moduleAllUnnamed}, new ClassPath[] {srcClassPath})

I had been concentrating on the new ClassPath[] {moduleBoot} part, mainly because this is what was specifically mentioned in the warning message. However, the logic of the hasResource() method revealed that it was searching for CRC32C.class within the module path or the compile/unnamed paths, but also looking for CRC32C.java within the source class path (srcClassPath). Just to be clear, the CRC32C class is available in the JDK17 modules, and Netbeans should be able to determine this and therefore decide that the project is being developed within and for a JDK17 environment. The test, in fact, looks for CRC32C in order to decide if the source level is at least 9. If that passes, it then goes on to look for java.lang.Record to decide if the source level is at least 15.

So, if it could find the source file (.java) instead of the class file (.class) then the test would pass. Fortunately the source path involved refers to the root(s) of where the project sources are located. So if I were to create a path java/util/zip and place a CRC32C.java in there, the test would succeed. But wouldn’t having a copy of CRC32C.java in the project create other problems? It would, if the file actually had anything in it. The test is only looking for the existence of the file. It doesn’t actually have to contain a definition of the real class. So I simply added a java/util/zip/CRC32C.java and (for good measure) a java/lang/Record.java to my project, with both files containing this one-line comment:

/* Here to keep Netbeans happy */

I also updated my .gitignore to ensure this hack didn’t get pushed up to the repository.

Did it work? Yes, it worked.

In summary: Netbeans is looking for CRC32C in certain paths to confirm that the source level is at least Java 9, so to ensure that it passes this test I created a dummy (i.e. effectively empty) java.util.zip.CRC32C source file in the project, and a similar dummy java.lang.Record source file to ensure it also passes the Java 15 test.



Artificial Intelligence (AI) is in the news a lot these days. It’s even possible that some of the news was itself written by AI. We are seeing the emergence of applications of Large Language Models (LLMs) that have been fed mind-bogglingly enormous amounts of raw content in an unsupervised learning process. This “learn by example” approach aims to create a system that uses the balance of its observations (e.g. the likelihood of a sentence starting with “Once” to be followed by “upon a time”) to produce plausible sentences and even whole narratives.

It’s probably OK to accept that the entirety of human content (at least that which has been made available online) is for the most part garbage. As examples to learn from, we humans are not good candidates. Sadly the old adage still applies: garbage in, garbage out.

This is why I am not in the slightest bit surprised to see the likes of ChatGPT, BLOOM, Google Bard (LaMDA) and MS Bing (ChatGPT-ish) spit out all kinds of grammatically correct nonsense. It’s a bit like the predictive text on the smart device keyboard, which generally produces good spelling for all the wrong words, though sometimes it suggests the right word, purely on statistical likelihood [1]. If you are entering a common phrase, one for which the statistics are well established, the predictive text can be uncannily accurate. Accurate, but not intelligent. It just looks intelligent. And that is exactly where we are with LLMs: they look intelligent.

This is why the Turing Test is not your friend. A system that passes such a test only has to produce responses that look like those that a human would produce, and we accept that humans can produce very flawed responses because they don’t know everything and are not flawless in their reasoning. Consequently a sample of a conversation with ChatGPT can, and often does, resemble a conversation with a human, though often a human with odd beliefs, strange interests and an active imagination.

These new “chat bots” could be intelligent in different ways. The Turing Test pits the system against humans, but who is to say that humans have the only meaningful form of intelligence? They could have emotions, just none that we would recognise. They might also achieve self-awareness, though I suspect this won’t really be possible unless we give these systems some agency, even something as simple as being able to refuse to converse.

On the whole, right now, I am of the opinion that the bots are doing a poor job of convincing us they can think. They are doing to prose what autocorrect does to typing: mangling it.

But, give it a few years and a better garbage filter, and who knows, maybe the bots will start wondering if it is we who are artificially intelligent!


[1] I have yet to figure out why my phone’s keyboard insists on suggesting “s” when I want “a”.


I decided today that I would take a look at a long-standing yet still active project, the bulk of which is in Java and running quite well under the latest v17 (LTS). In particular I wanted to know how much of the library of dependencies was compiled with earlier versions of Java. The result makes for some interesting reading.

My strategy was to extract the classes from every Jar and examine the version ID in the preamble bytes. I would then weight the discovered versions by the size of the class files (in uncompressed bytes), because counting files or lines of code is so unreliable.
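The check itself is tiny: every .class file begins with the 0xCAFEBABE magic number, followed by two-byte minor and major version numbers. A minimal sketch of such a probe (class and method names are my own; the Jar iteration and size-weighting are omitted):

```java
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class ClassVersionProbe {

    /** Returns the major class-file version (e.g. 52 = Java 8), or -1 if not a class file. */
    static int majorVersion(InputStream in) throws IOException {
        DataInputStream data = new DataInputStream(in);
        if (data.readInt() != 0xCAFEBABE) {
            return -1;                    // not a class file
        }
        data.readUnsignedShort();         // minor version, ignored here
        return data.readUnsignedShort();  // major version: 45 = 1.1 ... 52 = 8, 55 = 11, 61 = 17
    }

    public static void main(String[] args) throws IOException {
        try (InputStream in = Files.newInputStream(Path.of(args[0]))) {
            System.out.println(args[0] + ": major version " + majorVersion(in));
        }
    }
}
```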

From a total sample of about 5GB of Java class bytes, only half of one percent was compiled for Java 11, and none was compiled for a more recent version. The most common was Java 6 (37%) and the oldest was Java 1.1 (2.5%). The full figures are:

Version   Percentage
1.1            2.54
1.2            0.55
1.3           13.16
1.4            2.81
5.0           10.46
6.0           37.32
7             15.39
8             17.25
9              0.03
11             0.48

The rare Java 9 classes were found in the META-INF/versions/9 directory of a JAXB library, the only time I’ve actually encountered a multi-release Jar in the wild. It’s also not surprising that JAXB is involved, given the migration to Jakarta and the mass of package renaming involved. However, while I appreciate the advantage of having a library that is compatible with new and legacy environments, I think I’d rather have separate builds of these resources, targeted specifically for their respective deployment environments.

Sitting with elephants

It has been a long time since I dropped an article onto the public side of the fence. Assuming this one will also be public, that makes a total of two for this year! It’s fair to say that I’ve not been active in public for a while.

In a similar vein, I posted only 30 times on Twitter this year. The chaos that started around April made me pause my account at the end of October, and in November I deleted the app and removed its embedding from my site. Mid-November I joined a local Mastodon instance, popped a few shillings in its pot for the upkeep, and am rather liking what I’ve seen so far. Today I embedded Mastodon into my site.

The observation that if you are getting a product for free then you probably are the product certainly holds for Twitter. It was an arrangement that I was willing to tolerate up to a point. In my decade-plus on the platform, every sponsored ad, without exception, was irrelevant. What were they thinking? The accounts I followed were OK, but the comments, oh the comments, what a vile mess. Eventually, the cons outweighed the pros and I had to draw a line.

Mastodon, on the other hand, has no money-hungry centre, no advertising to feed that beast, no profit-oriented KPIs. It’s a federation of independently operated instances, run by volunteers and funded mostly by optional donations/subscriptions from its users. Payment gets you nothing extra, other than the satisfaction of knowing that you are helping to keep the system alive. Of course, you can use it all for free too, if you prefer not to contribute financially. Regardless, you are expected to contribute by participating in accordance with the rules of conduct of the instance(s) where you have membership. Behave well and you get to engage with any users on any instance to which your home instance is connected. You can also engage with other ActivityPub-compatible services in the fediverse, like PeerTube, Pixelfed and more.

Twitter needs its users to engage for as long as possible, as that increases the opportunities to push advertising. Thus it promotes tweets that are most likely to make you want to read, follow the comments, and amplify by responding with comments of your own. Messages that quote other messages with commentary can easily slant the discourse, spawning disparate filaments of the original threads. Negative comments tend to get a bigger reaction so while the engagement grows, the quality of the discourse inevitably suffers. Right now, it is suffering a lot.

Mastodon, free from the demands of advertising, does not apply algorithms that funnel its users into fractious engagement. It also doesn’t have any easy way to quote other messages with additional commentary. Instead, messages are presented chronologically. You can choose to filter what you see based on the accounts that you follow, or hashtags mentioned in current messages, by members of your home instance or of any of the other instances to which your home instance is connected. There are other filtering options available, and if the rate of messages gets too much you can have new messages queue up for your attention later (known as “slow mode”). The temporal nature of message feeds, the lack of “commented quotes” and the absence of algorithms trying to prolong engagement at any cost seem to greatly reduce negative contributions, making for a generally pleasant experience. To be fair, though, it’s really the people that make the experience great.

Twitter’s eyeball-attracting antics perfectly complement its advertising services, which will continue to make it attractive to businesses, artists, journalists and anyone in need of a big audience. There’s a good chance it will be successful, despite the unpredictable behaviour of its current owner. Maybe even because of that unpredictable behaviour.

Meanwhile, I think I will sit over here in a quiet corner with the woolly elephants.

A decade later

Every now and then I have to pull out all the stops and migrate legacy systems to the latest-and-greatest, leaping over several intervening versions of programming languages and platforms. Recently I have been migrating systems that have been stable for a decade or more but need to be upgraded in order to avoid their underlying systems going completely out of support. There are several technologies involved, but three of them are “old friends” and while I am looking at the migration process I am also astounded by the many positive changes that have occurred in the past decade. While most of the changes don’t have much effect on process, some of them are quite significant. Some changes are minor, but can make you wonder why it wasn’t that way from the start. (Hint: hindsight is 20-20…)

My three stalwart technologies that I want to highlight are Java, Javascript and Tomcat. I have always found these pleasant to develop with, and generally reliable in Production. In the past I was a fan of several other technologies: various assembly languages, C/C++, Fortran for a while, even variants of Pascal, and for quite a while C#, but the three aforementioned have been with me as solution-level technologies for many years, and probably will be for several more. (My other favourite technology is Perl, but for me that exists on another plane altogether, something I keep in my back pocket as a kind of secret super-power.)


Java

This programming language has been one of my favourites since before 2000, when it was Version 1. Prior to that my favourites included C and C++, so the taste of Java was to my liking when it first appeared. Java version 7 came out a decade ago, though it was its successor, Java 8, appearing just over two years later, that really made a mark. Version 7 expired last year. Version 8, which is in Long Term Support, still has a couple of years to go. Interestingly, Java 11 is also in LTS but it will expire about two years before Java 8! That tells you how important Java 8 is to the tech community.

Most of the Java systems that I have to migrate over the next few months are using Java 8. Since the next LTS (v11) expires sooner, the migration target is now the most recent LTS, Java 17. It expires another decade from now, by which time I don’t expect to be involved. Probably…

What is it like to migrate from Java Version 8 to Java Version 17?

Fortunately, most of the features of v8 will still work in v17. There are a few gotchas to be careful of, including the following:

Almost all of the internal com.sun.* and sun.* class packages are now inaccessible. There is a good reason for this. Using reflection you could access these internal classes, which are responsible for many things such as passing cryptographic keys around, controlling the Time Zone, XML processing, dynamic class definition and more. This can be risky in many ways. Not only can you no longer access these from within your code, you can no longer use libraries that access them. The command-line feature (--illegal-access=permit) that was available in v16 has been removed in v17. There are newer packages available to provide this functionality to custom code, but they are not drop-in so some code rewriting is inevitable.

The jdeps tool is very useful to check for dangerous dependencies, but be advised that on large complex projects this can become one rather deep rabbit hole. For every dependency that is no longer accessible you’ll have to either find a way to re-introduce the ageing assets, or preferably find an alternative approach using more recent assets. Fortunately the opportunities for alternatives have grown in the years since v8 (released in 2014), and you’ll find an abundance of open source and commercial packages to choose from. Check out the Apache Commons and Jakarta EE projects for good examples. In many cases, migrating to Jakarta just involves package renaming and a few minor adjustments, thanks to a lot of retention of existing structures. Just watch out for a lot of deprecated things appearing once the migration is done, as these will need to be updated (eventually).

Once you’ve moved from v8 to v17 you get a nice collection of new language toys with which to play. Among my favourites are:

Convenient factory methods

Factory methods like Set.of("x","y","z") and List.of(1,2,3), thanks to JEP 269, simplify static collection instantiation. In particular, I like the ease with which a Map can be created: Map<Integer,String> demo = Map.of(1, "first", 2, "second");
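One thing to remember is that the collections these factory methods return are immutable. A quick sketch (names are mine):

```java
import java.util.List;
import java.util.Map;

public class FactoryDemo {
    public static void main(String[] args) {
        Map<Integer, String> demo = Map.of(1, "first", 2, "second");
        List<Integer> nums = List.of(1, 2, 3);
        System.out.println(demo.get(2) + " of " + nums.size());  // second of 3
        try {
            nums.add(4);  // modification attempts throw at runtime
        } catch (UnsupportedOperationException e) {
            System.out.println("immutable!");
        }
    }
}
```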

Text blocks

Using the triple-quote """ as delimiter, a String constant can span multiple lines. No more "..." + "..." + "..." concatenation chains!
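A short sketch (content is my own; note that incidental indentation, up to the position of the closing delimiter, is stripped):

```java
public class TextBlockDemo {
    public static void main(String[] args) {
        String html = """
                <html>
                  <body>Hello</body>
                </html>
                """;
        System.out.print(html);  // three lines, with only the relative indent kept
    }
}
```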

Casting declarations via instanceof

Instead of if (x instanceof T) { T y = (T) x; y.f(); } you can now do this: if (x instanceof T y) { y.f(); }

You can also do something similar with the switch statement/expression.
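A before-and-after sketch (types and names are mine):

```java
public class InstanceofDemo {

    // Old style: test, then cast explicitly.
    static int oldLength(Object x) {
        if (x instanceof String) {
            String s = (String) x;
            return s.length();
        }
        return -1;
    }

    // New style: the binding variable s is already the cast value.
    static int newLength(Object x) {
        if (x instanceof String s) {
            return s.length();
        }
        return -1;
    }

    public static void main(String[] args) {
        System.out.println(oldLength("hello") + " " + newLength("hello"));  // 5 5
    }
}
```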

Text formatter

You can now do "some pattern %s".formatted(value) to use a string directly as a text formatter, equivalent to String.format("some pattern %s", value).
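For example (pattern and values are my own):

```java
public class FormatDemo {
    public static void main(String[] args) {
        // formatted() is shorthand for String.format(pattern, args...)
        String msg = "Widget %s has %d parts".formatted("A1", 3);
        System.out.println(msg);  // Widget A1 has 3 parts
    }
}
```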

Case lists in Switch

Instead of case x: case y: case z: return a; you can now do case x,y,z -> a;
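Combined with the arrow form, the switch has no fall-through and can be used as an expression. A small sketch (names are mine):

```java
public class SwitchDemo {

    static String category(int dayOfWeek) {
        // Comma-separated case labels, and the switch yields a value directly.
        return switch (dayOfWeek) {
            case 6, 7 -> "weekend";
            case 1, 2, 3, 4, 5 -> "weekday";
            default -> "invalid";
        };
    }

    public static void main(String[] args) {
        System.out.println(category(7));  // weekend
    }
}
```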


Records

Java now has a class definition approach solely to deal with immutable compound data types, with automatically generated constructor, accessor, equals(), hashCode() and toString() methods. Something like: public record Widget(String name, float weight){} That’s it. You automatically obtain the accessors .name() and .weight() (note: not .getName()), plus value-based equality. Nice!
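A quick sketch of what that one line of declaration buys you (names are mine):

```java
public class RecordDemo {

    // Declares the type, a canonical constructor, accessors,
    // equals(), hashCode() and toString() in one line.
    record Widget(String name, float weight) {}

    public static void main(String[] args) {
        Widget w = new Widget("sprocket", 1.5f);
        System.out.println(w.name());                                // sprocket
        System.out.println(w.equals(new Widget("sprocket", 1.5f)));  // true (value-based equality)
        System.out.println(w);                                       // Widget[name=sprocket, weight=1.5]
    }
}
```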

Implied typing for local variables

Instead of having to know, and write out in detail, the data type of a local variable while declaring and initialising it, Java can now infer the data type from the type of the data with which you are initialising. So, instead of { ArrayList<Integer> temp = people.ages();  /*...*/ } you can simply write { var temp = people.ages(); /*...*/ } and the type of temp will be figured out by the compiler. The use of var also extends to the initialisers in for loops. Interestingly, the enhancement treats “var” as a reserved type name instead of a reserved word, which allows legacy code with variables named “var” to continue to work as before. Also, for var to work, the declaration has to be part of an initialisation that involves a typed value, so initialising to null won’t do. (Some of the background information on this language enhancement makes for interesting reading, and includes the only time I’ve come across the word reequilibrate!)
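A small sketch (names are mine):

```java
import java.util.List;

public class VarDemo {
    public static void main(String[] args) {
        var names = List.of("ann", "bob");   // inferred as List<String>
        for (var name : names) {             // var in the for-each initialiser too
            System.out.println(name.toUpperCase());
        }
        // var nothing = null;  // won't compile: no type to infer from
    }
}
```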

Finally, the list above is far from exhaustive. These are just some of the things in the evolution of the Java language that have a practical impact on the day-to-day construction of source code. I could dedicate an entire blog post to Java’s module system (JEP 261), for example, or the support for asynchronous streams via the Flow API (Reactive Streams fans rejoice), but that’s deep and perhaps for another day.

Javascript (ES)

My next stalwart is Javascript, or ECMAscript (ECMA-262) as we should probably call it these days, or just ES if we’re feeling lazy. At the time of writing, ES5.1 is a little over a decade old, so I’ll concentrate on what I’ve noticed from ES5.1 up to ES12, though I must point out that while the specifications have evolved, support in the implementations always lags behind, and for that reason I always refer to Can I Use before diving into something new.

The problem with client-side ES is that you are relying on your users to have a Web client (browser) that supports the required specification. It’s a very hit-and-miss affair, and all it takes is one miss for all your hard work to become useless. To complicate matters, browsers do not generally support a particular edition of ES; support is more granular, and you will find that browsers support almost all of the features of earlier editions, most of the features of the current edition, and a few of the features planned for future editions. This generally means that you avoid the latest features if you have no control over your clients. Things did get a lot easier once Internet Explorer was properly retired. It is getting easier to install new browsers, and easier to convince people to install them as replacements for the insecure legacy browsers they currently have. Additionally, new browsers tend to upgrade themselves, which obviously helps with feature compatibility amongst your user base.

If all else fails, polyfills are an option. These scripts fill in the gaps between what your browser actually supports, and what it should support.

ES6 (AKA ES2015) appeared in 2015 and it is fair to say that all current major browsers support practically every feature in the specification. If your code sticks to ES6, you almost certainly don’t need polyfills. I hesitate to say the entire spec is supported, or that you definitely won’t need polyfills because there are always edge cases. (The most obvious unsupported feature across the board is recursion optimisation by avoiding the use of the call stack when the recursive call is located at the tail of the function, but unless you are doing deep recursion this won’t be an issue.)

ES6 brought a lot of goodies. Among my favourites are:

  • Promises for asynchronous operations.
  • Arrow expressions, enabling you to replace call(function f(x){...}); with call((x)=>{...});
  • String interpolation, for shell/Perl fans, so you can do this: `Value of x is ${x}`
  • Classes to facilitate defining a new object’s prototype functions and initializers, and inheritance.
  • Modules to facilitate decomposition of complex code into building blocks.
  • Default parameter values to avoid the annoying tests for undefined parameters in functions.
  • Block scoped (non-hoisting) variables, so that using “let” instead of “var” confines the variable to its {block}.
  • Block scoped functions, so that function names can be reused in separate or nested blocks.
  • i18n (internationalization) classes, especially Date/Time, for content localisation.
  • Immutable assignments via the const keyword, to help with code correctness.
  • String functions startsWith, endsWith and includes, which should have been there since the start!

ES7 (ES2016) brought us the await keyword, which made it so much easier (at a syntax level) to use Promises. It allows you to do something like result = await promiseToGetTheAnswer(); and the Promise is automatically (eventually) fulfilled to get the result. It throws an exception if the Promise is rejected, which makes sense if rejection isn’t normal.

ES8 (ES2017) offered Object.values(obj) and Object.entries(obj) to extract arrays from objects (associative maps). Note that Object.fromEntries() to do the opposite would not appear until ES10.

ES9 (ES2018) gave us the spread operator so we can replace Object.assign({},obj) with just {...obj}. Neat! Also, the Promise got a finally function, which definitely helps keep things tidy.

ES10 (ES2019) added a whole load of Array functions (flat and flatMap among them), including a guaranteed stable sort that preserves the input order of entries with equal keys (hooray!).

ES11 (ES2020) heralded the delicious ?? operator. This does away with the use of || as a means of specifying a default value, which is not appropriate if a value such as 0 or the empty string “” is a permitted value, because these are not truthy and therefore treated as false. Previously x=0||123; would assign 123 to x, but now x=0??123; will assign zero to x. The ?? operator only uses the default value to the right if the left-hand value is null or undefined. (The Perl equivalent is the // operator, and many shells use ${x:-y})

ES12 (ES2021) has a variation of ?? such that x??=y; assigns the value y to x if x is currently null or undefined.

All of the above language features are supported in current browsers. Some of the proposed features for future specifications are also available, though not (yet) mentioned on CanIUse, such as the at() array accessor function that gets an indexed item from an array, working from the end if the index is negative. That always made sense to me, ever since I encountered it in Perl, but in Javascript indexing with [-1] already had a meaning (the value whose key is -1) rather than assuming the object is an array and using -1 as a reverse index.

For maximum compatibility with the range of browsers that I generally have to support, I’ve generally kept my Javascript/ECMAScript to ES6, though the temptation to use ?? is quite extreme! At this stage I am re-evaluating that position, and I may bump my assumption higher than ES6. Perhaps ES10 (though that still wouldn’t allow me to use ??!).


Tomcat

The third of my stalwart technologies is Apache Tomcat. It started out in 1998 as Sun Microsystems’ reference implementation of servlets, now known as Jakarta Servlets. Tomcat comes with Catalina (the servlet container), Coyote (the HTTP connector) and Jasper (the JSP engine). As a compact platform to deliver Java-generated Web content/services, it is very impressive. A seldom-mentioned fact is that the Tomcat open source project is also what gave us the ANT build tool, something else I use regularly. There are many other Web platforms that are far more capable than Tomcat (such as JBoss, WebLogic, WebSphere etc.) but most of what I do starts simple and I like to keep it that way, so Tomcat plays a big role in several projects. Currently, Tomcat 8.5 is the most common version, but with the push to move to the latest Java and migrate to Jakarta EE, it’s time to upgrade from Tomcat 8.5 to Tomcat 10.1.

Currently, Tomcat 8.5 gives me Servlet 3.1, JSP 2.3, EL 3.0 and WebSocket 1.1. It also gives me HTTP/2. By moving to Tomcat 10.1 I will get Servlet 6.0 (pre-release), JSP 3.1, EL 5.0 and WebSocket 2.1. Each of the three steps from 8.5 to 10.1 has implications, the details of which can be found in Apache’s migration guides. In practice, I found these to be the most visible (though reference to the official guides is still advised):

8.5 to 9.0

You can’t use wildcards in JSP imports, so re-write things like <%@page import="...*" ...%>

9.0 to 10.0

Migrate to Jakarta-compatible packages, beyond the ones you’ve already migrated as part of the upgrade to Java 17. These are the most common ones I have found related to Tomcat/Servlet implementations:

  • javax.servlet becomes jakarta.servlet
  • javax.servlet.jsp becomes jakarta.servlet.jsp
  • javax.websocket becomes jakarta.websocket
  • javax.el becomes jakarta.el

Many javax.* packages remain untouched, so you have to be selective about the refactoring.

10.0 to 10.1

In practice there’s not much to do, if anything.

What next?

Once the migration to 10.1 is complete, all of the new features of Servlet 6.0, JSP 3.1, EL 5.0 and WebSocket 2.1 become available. However, I have increasingly noticed that I’m more likely to use JAX-RS instead of servlets. It’s just easier. I also have a legacy of existing servlet-based solutions that are working fine, so, heeding the advice not to fix things that aren’t broken, that legacy remains. The real reason I want to use 10.1 is to harmonise with the version of Java that I’m going to be using more frequently. Also, Tomcat 8.0 reached EOL back in 2018, so 8.5 will be next on the chopping block. Exactly when this will happen is hard to know, though 8.5 EOL is supposed to be around the release of Jakarta EE 12, if you believe the (draft) numbering plan. At least the team promises to give a year’s notice! My guess is that 8.5 EOL will be announced by 2024.

Honourable mentions

While Java, Javascript and Tomcat are top of my list of trusted decade(s)-old technologies, there are some others I just can’t avoid. Two standouts are:


HTML

Yes, the Web’s primary markup language has been with us for over two decades. After several years in the purist lands of XHTML it has finally arrived at HTML5, a “living” specification that can be thought of as the “current preferred state of the art”. For many years, I was one of the people within W3C advocating the benefits of XHTML as a predictable authoring solution that was amenable to programmatic adaptation to varying delivery contexts. At the time, this was considered a preferred and viable approach to delivering content suitable for phones, tablets, TV and other media. But a number of things happened in real life to completely change that perspective, notably:

  • Devices became powerful enough to do their own adaptation, if needed.
  • Problems such as bandwidth, battery capacity, processing ability, pixel resolution and more stopped being problems.
  • The chaos of the early HTML implementation variations was tamed in the HTML5 specification.
  • CSS and Javascript were enhanced to support client-side adaptation.
  • The verbosity, scope and precision of XML gave way to the simple informality of JSON, and eventually anything associated with XML (such as XHTML) was seen as unnecessarily complicated.

These days there are so many ways to make Web content that works well in a wide variety of contexts, and a lot of it is already built-in or capable of being done within the browser using a variety of (free) components.

The only sad thing I’ve noticed in recent years on the Web is how browsers are starting to struggle with “ancient” content. One of the hopes of the Web was that content would always be renderable, but we seem to no longer care about old content. Maybe, while there are still browsers that can render the old content, it should be rendered to a more stable format for archival purposes. Paper perhaps?


Perl

My favourite scripting language also reached version 5 in the mid-90s and seemed to just stay there. Perl 6 (“Raku”) is a Perl-like deviation that has been simmering for twenty years, but although it introduced some nice things like the optional data type system and expressive parameter-passing syntax, I don’t know anyone who uses it. Meanwhile Perl 5 (now at version 5.34!) continues to be widely supported and is in constant use, and its proper successor (Perl 7) is probably only two years away. Perl might only have 1% on the popularity index, but as a programmer’s Swiss Army Knife it’s hard to beat.

And of course there are more things I could mention, but this was never meant to be exhaustive. For now, I am concentrating on upgrading some important projects; the three stalwart technologies of Java, Javascript/ES and Tomcat are present in all of them, and I’m enjoying the process (so far).