It's not uncommon on HN! We frequently have people chiming in as CEOs, insiders, and experts in various fields without much proof. Generally, it hasn't been a problem. Or at least I've not seen any examples of having wool pulled over our eyes in this fashion.
I tried Windsurf for the first time last week, and I had pretty mixed results. On the positive side, sometimes the tool would figure out exactly what I was doing, and it truly helped me. This was especially the case when performing a repetitive action, like renaming something or making the same change across multiple related files.
Most of the time, though, it just got in the way. I'd press tab to indent a line, and it'd instead jump halfway down the file to delete some random code. On more than one occasion I'd be typing happily, only to find it had gone off and completely mangled some unrelated section without my noticing. I felt like I needed to be extremely attentive when reviewing commits to make sure nothing was amiss.
Most of its suggestions seemed hyper-fixated on changing my indent levels, adding braces where they weren't supposed to go, or deleting random comments. I also found it broke common shortcuts, like tab (as above), and ctrl+delete.
The editor experience also felt visually very noisy. It was constantly popping up overlays, highlighting things, and generally distracting me while I was trying to write code. I really wished for a "please shut up" button.
The chat feature also seemed iffy. It was actually able to identify one bug for me, though many times when I asked it to investigate something, it would get stuck scanning through files endlessly until it just terminated the task with no output. I was using the unlimited GPT-4.1 model, so maybe I needed to switch to a model with a longer context window? I would have expected some kind of error, at least.
So I don't know. Is anyone else having this experience with Windsurf? Am I just "holding it wrong"? I see people being pretty impressed with this and Cursor, but it hasn't clicked for me yet. How do you get it to behave right?
I find Cursor annoying too, with dumb suggestions getting in the way of my attempts to tab-indent. They should make shift-tab the default way to accept its suggestion, instead of tab, or at least let shift-tab indent without accepting anything if they really want to keep tab as default autocomplete.
I find it's very model-dependent. You would think that the more powerful models would work the best, but that hasn't been the case for me. Claude Sonnet tends to do the best job of understanding my intent and not screwing things up.
I've also found that test-driven development is even more critical for these tools than for human devs. Fortunately, it's also far less of a chore.
For renaming things: I have much more confidence in "traditional" IDE intellisense than in LLMs, since the IDE knows exactly which occurrences to change and never hallucinates changes to irrelevant ones (the worst case being variables that share a name but are actually different variables, due to scoping or living in different functions).
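To make that concrete, here's a hypothetical snippet (the names are made up) where two locals share the name `total` but are entirely separate variables; a scope-aware IDE rename in the first function leaves the second untouched, while naive find-and-replace, or a model guessing, might not:

    # Hypothetical example: two different variables that happen to share a name.
    def sum_prices(prices):
        total = 0          # renaming *this* `total` should only touch this function
        for p in prices:
            total += p
        return total

    def count_items(items):
        total = 0          # same name, different variable (different scope)
        for _ in items:
            total += 1
        return total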
I've recently been learning about how fonts render based on the subpixel layouts of monitor panels. Windows assumes that all panels use an RGB layout, and its ClearType software renders fonts with that assumption in mind. Unfortunately, this leads to visible text fringing on new display types, like the alternative stripe pattern used on WOLED monitors, or the triangular pattern used on QD-OLED.
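To make the fringing mechanism concrete, here's a toy sketch (my own illustration, not ClearType's actual algorithm) of the basic idea: sample glyph coverage at 3x horizontal resolution and map each third of a pixel onto one color channel, assuming the subpixels run R, G, B from left to right:

    # Toy illustration only, not ClearType's actual algorithm.
    def pack_rgb_stripe(coverage_row):
        """coverage_row: grayscale glyph coverage (0..1) sampled at 3x horizontal
        resolution. Returns one (r, g, b) tuple per output pixel, assuming an
        RGB stripe panel."""
        pixels = []
        for i in range(0, len(coverage_row) - 2, 3):
            r, g, b = coverage_row[i:i + 3]   # leftmost sample -> red subpixel, etc.
            pixels.append((r, g, b))
        return pixels

    # A sharp vertical glyph edge: ink on the left, background on the right.
    edge = [1.0, 1.0, 1.0, 1.0, 0.33, 0.0, 0.0, 0.0, 0.0]
    print(pack_rgb_stripe(edge))
    # On a true RGB stripe, the partially lit channels line up with the edge.
    # On a panel with a different physical layout (BGR, WOLED's extra white
    # subpixel, QD-OLED's triangular triads), those values light the wrong
    # physical positions, which shows up as the color fringing described above.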
Some third-party tools exist to tweak how ClearType works, like MacType[1] or Better ClearType Tuner[2]. Unfortunately, these tools don't work in Chrome/electron, which seems to implement its own font rendering. Reading this, I guess that's through FreeType.
I hope that as new panel technologies become more prevalent, somebody takes the initiative to help define a standard for communicating subpixel layouts from displays to the graphics layer, which text (or graphics) rendering engines can then make use of to improve hinting. I do see some efforts in that area from Blur Busters[3] (the UFO Test guy), but still not much recognition from vendors.
Note I'm still learning about this topic, so please let me know if I'm mistaken about any points here.
I'm pretty sure Windows dropped subpixel anti-aliasing a few years ago. When it did exist, there was a wizard to determine and set the subpixel layout.
Personally I don't bother anymore anyway since I have a HiDPI display (about 200dpi, 4K@24"). I think that's a better solution, simply have enough pixels to look smooth. It's what phones do too of course.
To be clear: Windows still does subpixel rendering, and the wizard is still there. The wizard has not actually worked properly for at least a decade at this point, and subpixel rendering is always enabled, unless you use hacks or third-party programs.
DirectWrite doesn’t apply subpixel anti-aliasing by default [0], and an increasing number of applications use it, including Microsoft Office since 2013. One reason is the tablet mode starting with Windows 8, because subpixel ClearType only works in one orientation. Nowadays non-uniform subpixel layouts like OLED panels use are another reason.
macOS dropped it a few years ago, primarily because there are no Macs with non-HiDPI displays any more (reducing benefit of subpixel AA) and to improve uniformity with iOS apps running on macOS via Catalyst (iOS has never supported subpixel AA, since it doesn’t play nice with frequently adjusted orientations).
Windows I believe still uses RGB subpixel AA, because OLED monitor users still need to tweak ClearType settings to make text not look bad.
> because there are no Macs with non-HiDPI displays any more
That is not true. Apple still sells Macs that don't come with a screen, namely Mac Mini, Mac Studio, and Mac Pro. People use these with non-HiDPI monitors they already own all the time.
That's not really the most charitable reading of GP's comment. I think they very clearly mean that Apple does not sell Macs with non-HiDPI displays anymore. It's not a configuration they sell, so they don't need to support those features anymore in their current offerings.
You're right that there's nothing stopping someone from hooking up an HDMI-to-VGA adapter for their 22" Trinitron from 2001, but that doesn't mean boat anchors are a meaningful market segment. It's not a consideration for why they should retain a font rendering feature for a modern OS. You're just going to have to suffer with fuzzy fonts for your retrogaming hobby.
So what is the "configuration they sell" for the desktop Macs? The Studio Display that costs way too much for what it is, so to no one's surprise, they're not selling all that many of those? Or the Pro Display XDR for which the stand alone costs more than an entry-level Mac Mini? Sure no one will buy a $1600 monitor to use with their $600 Mac Mini. They'll get a much cheaper third-party 2K one.
Apple put all their chips behind Retina/HiDPI displays. To that end, they've got really good HiDPI resolution scaling, they no longer sell displays incapable of Retina features (in laptops or stand-alone), and they have removed features that only serve to support sub-4k displays. To Apple, 4k is the minimum standard.
If you want a 2K monitor you can buy one and hook it up, but Apple isn't interested in making it look good. It's not a new decision, either. They stopped selling MacBooks without Retina displays in 2016. They haven't supported 2K scaling since the M1 Mac Mini over 5 years ago: https://www.macworld.com/article/549493/how-to-m1-mac-1440p-...
Apple is not a budget vendor. They're a premium vendor. That's not just what other people call them. It's what they themselves profess to be. That's why you can get an Apple Thunderbolt cable for $70. To Apple, if you buy a Mac Mini, yes they're expecting you to hook it up to a 4K monitor. They expect you to be getting a Mac Mini because you want a Mac, not because you can't afford a MacBook.
Well, in my particular case, I use an M1 Max MBP with a 2K monitor that I already had when I bought the MacBook.
The problem with 27" 4K monitors is that you can't have integer scaling on them. If you set the scaling factor to 1x, everything is too small; if you set it to 2x, everything is huge; and macOS can't properly do fractional scaling, because what it actually does is render everything into a 2x framebuffer and downscale that for output.
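As a rough illustration of what that means on a 27" 4K panel (my own numbers, assuming the common "looks like 2560x1440" setting):

    # Sketch of the "render at 2x, then downscale" behavior described above.
    panel = (3840, 2160)        # physical pixels of a 27" 4K display
    looks_like = (2560, 1440)   # assumed "looks like" fractional setting

    # macOS renders into a 2x backing store of the logical resolution...
    backing_store = (looks_like[0] * 2, looks_like[1] * 2)   # 5120 x 2880

    # ...then scales that buffer down to the panel's native resolution.
    scale = panel[0] / backing_store[0]                      # 0.75, non-integer
    print(backing_store, "->", panel, f"(x{scale})")
    # Every logical pixel gets resampled at a 0.75 ratio, so nothing lands
    # exactly on the physical pixel grid the way it does at integer 1x or 2x.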
And besides, only supporting HiDPI displays doesn't mean one can stop striving for pixel perfection. I hate SF Symbols icons because they're sizeless. They're an abhorrent blurry mess on my monitor but they're also not all that sharp on the MacBook screen. If you notice it once, it'll haunt you forever. Sorry. They do look fine-ish on iPhones though because those use OLED displays that lack the notion of a pixel grid anyway.
> Why don’t they produce 5K/6K monitors that allow for 2x integer scaling?
Because 5K panels are probably more expensive to produce than 4K ones, and because that would only benefit Mac users since Windows can do fractional scaling just fine. I'm not sure about that but it might also be that not all GPUs used in PCs can drive monitors larger than 4K.
Even if Windows/Linux do fractional scaling fine, integer scaling is still desirable if it’s an option. Under both I still run into programs that botch fractional scaling some way or another, and given the proclivity of programs on both platforms to be built with oddball UI toolkits I don’t expect that to ever really fully resolve itself.
It’s one of the chief complaints I have with one of my mostly otherwise good x86 laptops. The 1.5x scaling the display needs has been a pain point on multiple occasions.
Since you are so sure about how Mac Minis are used, is it 2K on 24" or 27" that these customers use?
My impression, based on the limited anecdotal data I have, is that most people with a Mac Mini are using it as their secondary device (everyone has a MacBook). Everyone is using 27" 4K monitors. 4K monitors are not that far from 2K monitors, and I think most people who prefer to buy 2K are gamers who want the higher refresh rate their GPU can support at 2K. But gamers are not using Macs anyway.
Viewing distance matters. ppi isn’t the target metric; it’s pixels per degree of vision that determines whether a display setup is “retina”. 60 ppd corresponds to 20/20 vision in a human.
My 34” monitor is only 4K but is “retina” at the viewing distance in my home office according to this calculator:
https://qasimk.io/screen-ppd/
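For a back-of-the-envelope version of what such a calculator does (a sketch, not the linked tool's exact math; the 16:9 panel shape and 30" viewing distance are just example numbers):

    # Average horizontal pixels per degree of vision at a given distance.
    import math

    def pixels_per_degree(h_resolution, screen_width_in, viewing_distance_in):
        fov_deg = math.degrees(2 * math.atan(screen_width_in / (2 * viewing_distance_in)))
        return h_resolution / fov_deg

    # Example: a 34" 16:9 4K panel is ~29.6" wide; viewed from 30".
    print(f"{pixels_per_degree(3840, 29.6, 30):.0f} ppd")   # ~73 ppd, above the ~60 ppd threshold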
They don't sell it as part of the configuration options.
You can separately purchase whatever monitor you wish. There are now plenty of 27" 5K monitors out there. Asus, LG (for now), Viewsonic, Kuycon, others I'm probably forgetting. They're expensive as far as monitors go, but not as expensive as the Studio Display.
Sure, but they’re not going to optimize for that case because the bulk of their Mac sales are tied up in their laptops and a significant chunk (I’d hazard a guess over 50%) of people buying Studios/Pros especially but also Minis are pairing them with Studio Displays, Pro Display XDRs, or more recently the various third-party 2x HiDPI display offerings from the likes of Asus, BenQ, Dell, and Samsung.
They’re fine to my eye, at least as good as well-tuned FreeType (as found on Ubuntu), as long as you’re either using a 2x HiDPI display or a “normal” DPI monitor with above-average density (e.g. 2560x1440 27”) and have subpixel AA forced on.
Where it falls apart is at densities any lower, for example it struggles on those awful 1366x768 15.6” panels that it seemed like every other laptop was built with for a while. Similarly 24” 1080p and 32” 2560x1440 are pretty bad.
It absolutely still does subpixel AA. Take a screenshot of any text and zoom way in, there's red and blue fringing. And the ClearType text tuner is still a functional builtin program in Win11 24H2.
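If you'd rather check programmatically than eyeball it, something like this works (a sketch; it assumes you've saved a screenshot of black-on-white text as text.png and have Pillow installed):

    # Count strongly colored pixels in what should be grayscale text.
    from PIL import Image

    img = Image.open("text.png").convert("RGB")
    fringed = sum(
        1 for r, g, b in img.getdata()
        if max(r, g, b) - min(r, g, b) > 32   # noticeably non-gray pixel
    )
    print(f"{fringed} color-fringed pixels out of {img.width * img.height}")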
I still have subpixel antialiasing on when using a 28" 4K display. It's the same DPI as a FHD 14" display, typical on laptops. Subpixel AA makes small fonts look significantly more readable.
But it only applies to Linux, where small fonts can be made to look crisp this way. Windows AA is worse: small fonts are a bit more blurred on the same screen. And macOS is the worst: connecting a 24" FHD screen to an MBP gives really horrible font rendering, unless you make fonts really large. I suppose it's because macOS does not do subpixel AA at all, and assumes high-DPI screens only.
As far as I'm aware, ClearType is still enabled by default in Windows.
Subpixel text rendering was removed from MacOS some time ago, though, presumably because they decided it was not needed on retina screens. Maybe you're thinking of that?
The standard is EDID-DDDB, and subpixel layout is a major part of that specification. However I believe display manufacturers are dropping the ball here.
For me, as an old-time user, (ab)using subpixel layouts for text rendering and antialiasing is counterproductive and (especially with current pixel densities, but also in general) introduces many more issues than it ever actually solved.
“Whole pixel/grayscale antialiasing” should be enough, and then a specialized display controller would handle the rest.
Agreed, but layouts such as Pentile don't actually have all three subpixel components in a logical pixel, so you'll still get artifacts even with grayscale AA. You can compensate for this by masking those missing components.
Surprising info; I thought this was supposed to be the part about the "display controller taking care of any additional issues". Thanks for the link with details, I'll read it with interest.
I may be totally off the mark here, but my understanding is that the alternative pixel arrangements found in current WOLED and QD-OLED monitors are suboptimal in various ways (despite the otherwise excellent qualities of these displays) and so panel manufacturers are working towards OLED panels built with traditional RGB subpixel arrangements that don’t forfeit the benefits of current WOLED and QD-OLED tech.
That being the case, it may be that in the near future these alternative arrangements are abandoned and become one of the many quirky “stepping stone” technologies that litter display technology history. While it’s still a good idea to support them better in software, that might put into context why there hasn’t been more effort put into doing so.
Sub-pixel anti-aliasing requires outputting a pixel-perfect image to the screen, which is a challenge when you're also doing rendering on the GPU. You generally can't rely on any non-trivial part of the standard 3D rendering pipeline (except for simple blitting/compositing) and have to use the GPU's compute stack instead to address those requirements. This adds quite a bit of complexity.
I do think it's strange that using a wrapper like var() wasn't required here. Like func() or such. I'd actually rather that var() hadn't existed to begin with, but it should be consistent, at least.
There was Google Contributor[1], which they tried launching three different times before giving up on it. It seems like nobody was interested in paying to remove ads on the web.
Reading your link, it basically launched twice; the second time, you couldn't even pay to remove adverts, but you could bid as you browsed to drive up the price of advertising.
So you paid, and you still saw adverts, it was never available outside of the USA, and didn't last very long for each launch.
Not that people weren't interested. However this is the first I've heard of it, so there's that too.
I used this same approach in a recent web app and it worked great. You can also use scrollbar-gutter: stable, which keeps the scrollbar's space reserved even when scrolling is disabled, so content doesn't reflow.
Yep, as a user I didn't find it confusing at all. F-Droid is designed for and around adding custom repos. FUTO links to their own repo and it all works fine.
I'd definitely consider this as being "available on F-Droid".