A few notes on Sonar


A validator is a great tool that checks your site for invisible defects. There are many of them, but Sonar is different from all the others. It is the only testing tool that aims to validate websites comprehensively, is open source and community-driven, and has browser integration. Why do frontend developers need a comprehensive testing tool? Why is independence from any big software vendor important? And finally, does Sonar have potential, and is it a useful tool right now? Well, I have collected some notes on its quality.

What are validators good for?

I have been in web development for more than 17 years. I like to watch how the World Wide Web changes over time. Some modern approaches become standardized while others fall into oblivion. I have met very good (HTML5 by Ian Hickson), very bad (RSS by Dave Winer), and very controversial (CSS Level 1 by the W3 Consortium) specifications. Sometimes the specification is wrong, other times the implementation is incorrect. Validators are helpful in the process of maximizing web compatibility, but they can also make you 100 % compatible with one software company at the price of breaking compatibility with dozens of other companies. There are basically two options: either use as many validators as possible and iterate towards the smallest possible number of errors, or use one universal, vendor-independent validator that knows the best possible techniques for you.

I currently use these validators:

Let’s look at Sonar

Both Sonar and modern.IE were made by the IE/Edge dev team. Why was modern.IE discontinued, and why doesn’t Sonar include validation of the browser configuration schema?

I don’t think that a web page served with the Content-Type: application/xhtml+xml HTTP header must contain the charset parameter, because the encoding of an XML document is UTF-8 by default.
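
The XML specification makes this default explicit: without a BOM and without an encoding declaration, a parser must treat the document as UTF-8. A minimal C# sketch demonstrating that fallback:

    using System;
    using System.IO;
    using System.Text;
    using System.Xml;

    class XmlEncodingDemo
    {
        static void Main()
        {
            // UTF-8 bytes with no BOM and no encoding declaration; per the
            // XML specification the parser must fall back to UTF-8 here
            byte[] document = Encoding.UTF8.GetBytes("<root>čeština</root>");
            using (var reader = XmlReader.Create(new MemoryStream(document)))
            {
                reader.ReadToFollowing("root");
                Console.WriteLine(reader.ReadElementContentAsString()); // prints "čeština"
            }
        }
    }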

I agree that a CSS style sheet served with the Content-Type: text/css HTTP header should have the charset=utf-8 parameter, but why isn’t this already fixed in ASP.NET Core 2?
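
Until it is fixed, the mapping can be overridden by hand. A sketch for Startup.Configure, assuming the default static file middleware:

    using Microsoft.AspNetCore.Builder;
    using Microsoft.AspNetCore.StaticFiles;

    // override the stock "text/css" mapping so static CSS files are
    // served with an explicit charset parameter
    var provider = new FileExtensionContentTypeProvider();
    provider.Mappings[".css"] = "text/css; charset=utf-8";
    app.UseStaticFiles(new StaticFileOptions
    {
        ContentTypeProvider = provider
    });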

The recommendation about IE document modes is very funny. First, if the page is served as application/xhtml+xml, IE 9 and newer will use an XML parser and force the highest available standards mode. Second, even if I send the content type as text/html, I really don’t need to use the X-UA-Compatible header or meta tag to avoid compatibility mode, because if the page contains the HTML5 doctype, IE 6 and newer will automatically use the highest available standards mode (in the case of IE 6, only when the XML prolog is omitted). Sonar should check this first before it starts spreading alarming messages.

It is interesting that during the Microsoft Edge Web Summit the recommended content type for ECMAScript files was application/javascript, and now the recommended content type is text/javascript. However, RFC 4329 from 2006 recommends application/javascript. Sonar should at least note which browsers are still not compatible with this standard. By the way, ASP.NET Core 2 serves files with a .js extension as application/javascript.
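
That default is easy to verify against the same FileExtensionContentTypeProvider that UseStaticFiles consults:

    using System;
    using Microsoft.AspNetCore.StaticFiles;

    // the built-in extension table decides the Content-Type of served files
    var provider = new FileExtensionContentTypeProvider();
    if (provider.TryGetContentType("app.js", out var contentType))
    {
        Console.WriteLine(contentType); // "application/javascript" in ASP.NET Core 2
    }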

I don’t think that the X-Content-Type-Options HTTP header must be specified when the server sends static files that are not user-generated.
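
If Sonar adopted that distinction, the server-side counterpart could look like this middleware sketch, where the /uploads prefix is a hypothetical location of user-generated files:

    // send nosniff only for responses that can carry user-generated
    // content; everything else is left alone
    app.Use(async (context, next) =>
    {
        if (context.Request.Path.StartsWithSegments("/uploads"))
        {
            context.Response.Headers["X-Content-Type-Options"] = "nosniff";
        }
        await next();
    });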

I don’t think that the Strict-Transport-Security HTTP header must be specified when the server sends styles, scripts or images. The server redirects the user from HTTP to HTTP+TLS and sets the HSTS header during the first request. Styles, scripts and images are downloaded afterwards, so the HSTS header on those responses has no effect anymore. The situation is different when these resources are loaded from another domain. Sonar should distinguish between these two cases and show an error only when the domain is different.
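
The flow I mean looks roughly like this middleware sketch (ASP.NET Core 2.1 ships UseHttpsRedirection and UseHsts for the same purpose):

    app.Use(async (context, next) =>
    {
        if (!context.Request.IsHttps)
        {
            // the very first plain-HTTP request is redirected to TLS
            var url = "https://" + context.Request.Host + context.Request.Path;
            context.Response.Redirect(url, permanent: true);
            return;
        }
        // every TLS response of the document carries HSTS (one year);
        // subresources fetched afterwards gain nothing from repeating it
        context.Response.Headers["Strict-Transport-Security"] = "max-age=31536000";
        await next();
    });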

Why should CSS or ECMAScript files have the charset=utf-8 parameter in the Content-Type header? Let’s suppose that the agent ignores the BOM, or that the server has a UTF-8 encoded file without a BOM. It is an issue only when the file contains non-ASCII characters. Sonar should take this into account.
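
The check Sonar could perform is cheap: a UTF-8 file without a BOM is ambiguous only when at least one byte falls outside ASCII. A sketch:

    using System.IO;
    using System.Linq;

    static class CharsetCheck
    {
        // true when the file actually needs an explicit charset
        // parameter to be interpreted correctly
        public static bool NeedsCharsetParameter(string path)
            => File.ReadAllBytes(path).Any(b => b > 0x7F);
    }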

Sonar will not show a warning when the site uses content compression on an encrypted connection and chunked transfer encoding isn’t used. Together, these three conditions make the site vulnerable to the BREACH attack.
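
Notably, the response compression middleware in ASP.NET Core keeps compression over HTTPS disabled by default for exactly this class of attacks. A sketch for Startup.ConfigureServices:

    using Microsoft.AspNetCore.ResponseCompression;
    using Microsoft.Extensions.DependencyInjection;

    // EnableForHttps defaults to false because compressing responses
    // over TLS opens CRIME/BREACH-style length side channels; it is
    // spelled out here only for clarity
    services.AddResponseCompression(options =>
    {
        options.EnableForHttps = false;
    });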

I agree that the Server and X-Powered-By headers are unnecessary and can pose a security risk, but why are Azure Web Apps configured to send them, and why doesn’t the default template for ASP.NET (Core) apps remove them in the web.config file?
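
At least the Kestrel half can be handled in code. A sketch for Program.cs; the X-Powered-By header is added by IIS in front of the application, so that one still has to be removed in web.config:

    using Microsoft.AspNetCore;
    using Microsoft.AspNetCore.Hosting;

    // switch off Kestrel's Server header; the IIS-added X-Powered-By
    // header is out of Kestrel's reach and needs a web.config entry
    var host = WebHost.CreateDefaultBuilder(args)
        .UseKestrel(options => options.AddServerHeader = false)
        .UseStartup<Startup>()
        .Build();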

I don’t think that serving an image file which could be 44% smaller is an error. First, Sonar should report this as a warning, not an error. Second, why does System.Drawing.Bitmap encode images into the PNG format with such terrible efficiency?
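
Part of the answer may be that GDI+ exposes no compression-level setting for PNG at all; the call below is the whole API, so getting close to the optimum means post-processing the file with an external optimizer such as zopflipng or pngcrush:

    using System.Drawing;
    using System.Drawing.Imaging;

    // System.Drawing offers no way to tune the PNG encoder; whatever
    // Save produces is what the client gets
    using (var bitmap = new Bitmap(800, 600))
    {
        bitmap.Save("output.png", ImageFormat.Png);
    }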

Conclusion

Sonar is an ambitious project with potential similar to Lighthouse’s. On the other hand, Sonar seems to be in an early stage of development. It needs to be a first-class citizen in the F12 Dev Tools, because Google can assert its interpretation of standards via Lighthouse in Chrome DevTools. Finally, it should be adopted by the Bing team, which has deep knowledge of the web and could improve Sonar’s quality to surpass Lighthouse.