
Saturday, May 16, 2026

Show HN: Got ghosted by tech companies so I built a tool to track ghost jobs https://ift.tt/NUCX1zk

Show HN: Got ghosted by tech companies so I built a tool to track ghost jobs Last year I was looking for a new role. I sent out applications, did the prep, waited. What came back was mostly nothing. Not rejection emails, just silence. The job listings I'd applied to stayed live for weeks. Some for months. As a software engineer, I decided to dig into it properly. I built a system to continuously track job postings across companies, logging posting dates and measuring how long roles stay open before closing, or whether they close at all. After 35,000+ listings across 200+ companies, some patterns are hard to ignore. Some listings have been open for 700+ days at companies you'd recognize. Others post 90% of their open roles within a single month, a signal that's harder to fake than a press release. I published two initial insight pages based on this work: - Which companies are posting most aggressively right now - Job listings that have been open for over a year What I didn't expect is that the same signals useful for detecting ghost jobs also say something broader about a company's hiring momentum: recruiting intensity, pipeline health, and where talent bottlenecks might exist. I'm not sure yet where this leads, but I'll keep expanding the dataset and publishing more insights as I go. Would genuinely love feedback on the methodology, interpretation, or obvious blind spots in the data. https://ift.tt/h0R5zCK May 17, 2026 at 03:43AM
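The core staleness signal described above (log the posting date, measure how long the role stays open, flag listings past a threshold) can be sketched in a few lines. This is a minimal illustration; the record layout, field names, and the 365-day cutoff are my assumptions, not the author's actual schema:

```python
from datetime import date

# Hypothetical listing records; field names are assumptions, not the tool's schema.
listings = [
    {"company": "Acme",   "role": "SWE II", "posted": date(2024, 5, 1),  "closed": None},
    {"company": "Acme",   "role": "SRE",    "posted": date(2026, 4, 20), "closed": date(2026, 5, 10)},
    {"company": "Globex", "role": "ML Eng", "posted": date(2026, 4, 25), "closed": None},
]

def days_open(listing, today):
    """Age of a listing: up to its close date if known, otherwise still counting."""
    end = listing["closed"] or today
    return (end - listing["posted"]).days

def flag_ghosts(listings, today, threshold_days=365):
    """Listings still open past the threshold are candidate ghost jobs."""
    return [l for l in listings
            if l["closed"] is None and days_open(l, today) > threshold_days]

today = date(2026, 5, 16)
ghosts = flag_ghosts(listings, today)  # the 745-day-old SWE II listing
```

The same per-listing ages also give the "posting burst" signal: group by company and count what fraction of open roles were posted in the last 30 days.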

Show HN: Hermes-agentmemory, pull-model episodic memory with real deletes https://ift.tt/oLaTbKF

Show HN: Hermes-agentmemory, pull-model episodic memory with real deletes https://ift.tt/IehkSWK May 17, 2026 at 01:00AM

Show HN: Rocksky – Music scrobbling and discovery on the AT Protocol https://ift.tt/yh4lcwB

Show HN: Rocksky – Music scrobbling and discovery on the AT Protocol https://ift.tt/H3XtkwY May 17, 2026 at 12:00AM

Friday, May 15, 2026

Show HN: Browser-based synthesizer, drum machine and sequencer https://ift.tt/KSy51Bs

Show HN: Browser-based synthesizer, drum machine and sequencer Inspired by the recent Boards Of Canada announcement, I've been in a lo-fi electronica mood lately and was going back and forth with Claude on how to design similar instruments in the browser that fit the genre. One thing led to another and pretty soon I had a fully browser-based polyphonic synthesizer / drum machine / sequencer. The interface and workflow were heavily inspired by the ReBirth RB-338 application released back in the '90s, but with lo-fi synth voices rather than the original 303 & 808 emulation. I know there's a significant overlap of developers and musicians and I thought some of you may enjoy playing with the app, or at least listening to the resulting album. I've also open sourced track 1 of the album via the performance script used to record it. It's in the repo. Bandcamp link to the resulting album: https://ift.tt/jtIDYqK... https://ift.tt/u8KyEls May 16, 2026 at 03:07AM
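For a sense of what a lo-fi synth voice boils down to, here is an offline sketch in Python/NumPy: two slightly detuned sawtooth oscillators summed and run through a one-pole lowpass. This is not how the app is built (a browser synth would use the Web Audio API); the sample rate, detune amount, and filter coefficient are all made-up illustrative values:

```python
import numpy as np

SR = 22050  # a low sample rate itself contributes to the lo-fi character

def synth_voice(freq, dur=0.5, detune=1.007, cutoff=0.15):
    """Two slightly detuned saw oscillators through a one-pole lowpass.
    All parameter choices are illustrative, not taken from the app."""
    t = np.arange(int(SR * dur)) / SR
    saw = lambda f: 2.0 * ((t * f) % 1.0) - 1.0   # naive (aliasing) sawtooth
    raw = 0.5 * (saw(freq) + saw(freq * detune))  # detuning gives a slow chorus beat
    # one-pole lowpass: y[n] = y[n-1] + cutoff * (x[n] - y[n-1])
    out = np.empty_like(raw)
    acc = 0.0
    for n, x in enumerate(raw):
        acc += cutoff * (x - acc)
        out[n] = acc
    return out

note = synth_voice(220.0)  # half a second of a filtered A3
```

The detune ratio is what gives the thick, wobbly character associated with the genre; the lowpass tames the naive saw's brightness.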

Show HN: Claude Code vs. Codex Global Usage Leaderboard https://ift.tt/12F6hrW

Show HN: Claude Code vs. Codex Global Usage Leaderboard https://ift.tt/cD3Z2u9 May 16, 2026 at 02:18AM

Show HN: Burn, baby, burn (those tokens) https://ift.tt/qRr9a8x

Show HN: Burn, baby, burn (those tokens) https://ift.tt/NJorQFR May 16, 2026 at 12:20AM

Show HN: Sx – an open-source package manager for AI skills, MCPs, and commands https://ift.tt/6EFQPav

Show HN: Sx – an open-source package manager for AI skills, MCPs, and commands https://ift.tt/ebcTCZo May 16, 2026 at 12:03AM

Thursday, May 14, 2026

Show HN: Halgorithem – Catching AI Hallucinations Using Trees, No AI in Pipeline https://ift.tt/Dqx3OLa

Show HN: Halgorithem – Catching AI Hallucinations Using Trees, No AI in Pipeline https://ift.tt/X20g9Mp May 14, 2026 at 10:38PM

Show HN: Yes We Scan: rescue old scanners with an in-browser Linux VM and WebUSB https://ift.tt/vi3c2zt

Show HN: Yes We Scan: rescue old scanners with an in-browser Linux VM and WebUSB https://ift.tt/Awok9IU May 14, 2026 at 11:25PM

Wednesday, May 13, 2026

Show HN: Neural window manager, neural network moving windows from mouse actions https://ift.tt/cgGRh2o

Show HN: Neural window manager, neural network moving windows from mouse actions I'd been mulling over this crazy idea for a while. Can programs be generated? Inspired by recent advances in world models, I wondered if we could do away with source code and generate pixels directly and interactively. As an experiment to answer this, I set out to create a neural window manager, training a neural network to predict what the screen would look like next. Basically, the idea was to generate the next frame based on the last two frames and the mouse position. That's it: moving windows without programming an event system, just a simple convolutional neural network guessing pixels. To implement the experiment, I used Pygame to simulate a turquoise desktop background, a gray window with a navy blue title bar, a white cursor, and four colors in total. Then, a bot randomly dragged the window, and I recorded everything, processing the frames as color index matrices (not RGB, to avoid complications) and the mouse delta (dx, dy, click) that caused each transition. 8000 frames, a few minutes in Colab. The model is a U-Net. The encoder compresses the stacked frames, the decoder reconstructs the next one, and the mouse vector is projected with a linear layer to match the spatial size of the bottleneck. There, it is concatenated before decoding, so that motion information feeds each skip connection. And it works! Which still surprises me a little. You can drag, and the window follows you; when you release, it stops. There's no internal state, no (x, y) coordinates anywhere. The model infers the position from what it sees, which works until it doesn't. But after a couple of seconds of strange movement, the window starts to distort. 
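The training-data setup described above (4-color index frames plus a (dx, dy, click) delta per transition) can be sketched like this. Frame size, palette indices, and window geometry are illustrative assumptions, not the author's actual values:

```python
import numpy as np

# 4-color palette as in the post: 0 desktop, 1 window body, 2 title bar, 3 cursor.
# Sizes and layout here are assumptions for illustration.
H, W = 32, 48

def render(win_x, win_y, mouse):
    frame = np.zeros((H, W), dtype=np.uint8)        # turquoise desktop -> index 0
    frame[win_y:win_y + 10, win_x:win_x + 16] = 1   # gray window body
    frame[win_y:win_y + 2,  win_x:win_x + 16] = 2   # navy blue title bar
    mx, my = mouse
    frame[my, mx] = 3                               # white cursor on top
    return frame

def transition(win, mouse, delta):
    """One training sample: (frame_t, frame_t+1, mouse delta), as in the post."""
    dx, dy, click = delta
    f0 = render(*win, mouse)
    new_win = (win[0] + dx * click, win[1] + dy * click)  # drag only while clicked
    f1 = render(*new_win, (mouse[0] + dx, mouse[1] + dy))
    return f0, f1, np.array([dx, dy, click], dtype=np.float32)

f0, f1, d = transition(win=(10, 8), mouse=(12, 9), delta=(2, 1, 1))
```

A scripted bot calling `transition` with random drags produces exactly the (last frames, delta, next frame) triples the U-Net trains on; the index-matrix encoding keeps the target a 4-way per-pixel classification rather than RGB regression.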
This will probably improve with more computing power for training and more examples, but to narrow the scope of the experiment and test it within a web browser, I decided to abandon the rendering aspect and have the model predict primitives instead of pixels, simply converting the motion engine into a neural network. Basically, I trained a small MLP to receive (distance to the title bar, distance to the resize point, click) and generate (dx, dy, dw, dh), with two separate heads: one for moving and one for resizing. The trick is that they share nothing except the click signal, so the model can't confuse dragging with resizing. I then exported it to ONNX as well, and now everything runs in the browser, without a server, just a canvas element and two small neural networks communicating with each other. With this new approach, the renderer remains deterministic, with rectangles drawn in JavaScript, but the window's behavior (where it moves, how it resizes) is learned from examples. It feels like a peculiar middle ground between traditional and neural, so you can feel the space the network has learned by interacting with it: dragging near the title bar moves it, but approaching the corner resizes the window. There are no conditionals or hitbox code; the network simply learned where those areas are from examples. Sometimes it gets confused near the edges, which, frankly, is more interesting than if it worked perfectly; you can perceive how the probability changes. This makes sense when you think about it, because no (x, y) coordinates are stored in these models; the position is implied in the activations. It works well for short sequences, but fails when asked to maintain state over time. Update: A few weeks later, Meta published the Neural Computers article (2604.06425, it's worth reading). The premise is the same, but they go much further: CLIs and UIs, real programs. 
Their failure modes are practically identical to those I found with the pure pixel version: "challenges persist with routine reuse, controlled updates, and symbolic stability," which is a fancy way of saying that the window blurs after a few seconds (that was the reason for choosing deterministic rendering). https://lusob.github.io/neural-os/ May 14, 2026 at 12:46AM
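The two-head primitive model is easy to sketch as a forward pass. The weights below are random stand-ins for the trained ones, the layer sizes and the click-gating are my assumptions, and the exact input wiring is simplified; the point is the structure the post describes: two heads that share nothing except the click signal:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_head(in_dim, hidden, out_dim):
    """One small MLP head; weights are random stand-ins for trained values."""
    W1, b1 = rng.normal(size=(in_dim, hidden)) * 0.1, np.zeros(hidden)
    W2, b2 = rng.normal(size=(hidden, out_dim)) * 0.1, np.zeros(out_dim)
    def forward(x):
        h = np.tanh(x @ W1 + b1)
        return h @ W2 + b2
    return forward

# Two separate heads sharing only the click signal, as in the post.
move_head   = make_head(2, 16, 2)  # (dist_to_title_bar, click)    -> (dx, dy)
resize_head = make_head(2, 16, 2)  # (dist_to_resize_point, click) -> (dw, dh)

def step(dist_title, dist_resize, click):
    dx, dy = move_head(np.array([dist_title, click]))
    dw, dh = resize_head(np.array([dist_resize, click]))
    # gate on the click so nothing moves while the button is up (my assumption)
    return (dx * click, dy * click, dw * click, dh * click)

out = step(dist_title=0.1, dist_resize=0.9, click=1.0)
```

With trained weights, a small distance to the title bar would dominate the move head and a small distance to the corner the resize head, reproducing the learned hit areas without any hitbox conditionals.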

Show HN: Splice – A programming language with custom VM for embedded systems https://ift.tt/1J7UGZu

Show HN: Splice – A programming language with custom VM for embedded systems https://ift.tt/aVm50Kh May 13, 2026 at 10:01PM

Show HN: Mistle – Open-source infrastructure for running sandboxed coding agents https://ift.tt/2VB1Q9m

Show HN: Mistle – Open-source infrastructure for running sandboxed coding agents Hi HN, I'm Jonathan. My co-founder, Thomas, and I started building Mistle in February. We saw larger tech companies like Ramp (Inspect) and Stripe (Minions) build this internally and thought an open source version should exist. We made a few very intentional decisions when working on this: 1. Credentials are kept out of the sandbox. Authorized access goes through a proxy, so agents do not directly receive credentials. 2. The harness is not our problem. We're not going to tackle things like memory or self-learning. 3. No magic. Configurations are explicit. You can bring your own keys for models, sandboxes, and other providers. You can write your own instructions and agent. Mistle can be run locally with a single command: https://ift.tt/dHQSJoC Questions, feedback and ideas are welcome! https://ift.tt/ciOf7VK May 13, 2026 at 09:37PM
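Point 1 (credentials stay out of the sandbox; authorized access goes through a proxy) can be illustrated with a toy version of the idea. Everything here, names and structure included, is hypothetical and not Mistle's actual code: the agent only ever addresses the proxy, and only the proxy holds the real token:

```python
# Toy sketch of the credential-proxy pattern. The secret store lives on the
# proxy side, outside the sandbox; the agent never sees a real credential.
SECRETS = {"api.example.com": "Bearer real-token-xyz"}  # hypothetical

def proxy_request(host, path, agent_headers):
    """Forward an agent's request, injecting credentials the agent never sees."""
    headers = dict(agent_headers)
    headers.pop("Authorization", None)        # discard anything the agent sent
    headers["Authorization"] = SECRETS[host]  # inject the real credential
    # A real proxy would now perform the upstream HTTP call and also log,
    # rate-limit, and scope the request; we just return what it would send.
    return {"host": host, "path": path, "headers": headers}

req = proxy_request("api.example.com", "/v1/jobs", {"Accept": "application/json"})
```

The useful property is that a compromised or misbehaving agent can exfiltrate at most its own proxy-scoped access, never the underlying token.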

Tuesday, May 12, 2026

Show HN: Gigacatalyst – Extend your SaaS with an embedded AI builder https://ift.tt/TSp7ALG

Show HN: Gigacatalyst – Extend your SaaS with an embedded AI builder Hi HN, I’m Namanyay from Gigacatalyst (link: https://ift.tt/zUEf2xX ). Gigacatalyst allows sales, CS, and users to build one-off features, so your SaaS can support long-tail customer workflows and engineers aren’t pulled away from the roadmap. When you sell software to large businesses, you realize that each customer needs their own workflow and features. Traditionally, this either means long engineering roadmaps or the customers end up using workarounds. But what if everyone could build their critical missing features just by talking to an AI? That’s what we do at Gigacatalyst. We provide an AI customization layer for your customers, CS team, and sales team to build these missing critical workflows without needing any engineers at all. Think Lovable, but built on top of YOUR platform. We connect to your product's APIs, learn your data model and design system, and let non-technical users build governed apps via natural language - inside your product, under your brand. Here’s what it looks like in action: https://www.youtube.com/watch?v=_taSpSphH6E One of our customers, a Series B company, saw their users (not engineers: managers, ops people, facility directors) build critical workflows like: - Parts stockout prevention: A maintenance manager typed "show me which parts will run out in the next 2 weeks based on usage over the last 90 days, accounting for vendor lead times." The app tracks consumption velocity, forecasts stockouts, and alerts before it's too late. He says it's prevented ~$500K in emergency downtime. - Invoice OCR from phone photos: Technicians kept losing paper invoices. The prompt: "upload a photo of the invoice, extract vendor name, date, amount, and line items, then match it to the purchase order and flag discrepancies." Now techs snap a photo on-site to automatically add to the system of record. 
- Restaurant emergency triage: A pizza chain's facilities manager was drowning in maintenance requests. He built a priority matrix: "walk-in freezer not cooling" auto-routes as CRITICAL, "dining room light flickering" goes to LOW. He's now able to manage backlogs with the correct priority. How Gigacatalyst works under the hood: 1. Agentic API discovery: Our agents go through your app and parse your endpoints, query params, request/response shapes, and sample data to build the base layer. 2. Generation and Validation: When a user describes what they want, our AI generates an app. We set up multiple validation steps, including static checks, runtime error analysis, and LLM-as-a-judge. 3. Sandboxing and Compilation: We wrote our own compilation and sandboxing framework to get the fastest speeds and lowest costs. This means that users can interact with the built app in seconds. 4. Proxy layer: We create a proxy layer for all APIs to handle auth, tenant isolation, and rate limiting. Everything the agent has access to is controlled, logged, observed, and version controlled. After 2000+ daily users, 900+ apps built, and 70% 30-day retention, today we're opening a public demo. Try it: https://ift.tt/OnuSl4D - enter your SaaS product's API URL (or just the homepage) and start prompting. If you're serving a variety of use cases, you probably deal with a lot of custom requests and Gigacatalyst will save you time and increase your bottom line. Book a meeting at https://ift.tt/l3ZDX7R and I'll help your team and customers build new functionality on top of your platform. I've been reading Hacker News since I was 12 years old. I'm proud to launch for all of you and I want to hear your feedback on my product and comments! May 12, 2026 at 11:32PM
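The stockout-prevention workflow quoted above is, at its core, a small consumption-velocity calculation. A hedged sketch with made-up numbers and a deliberately naive linear-usage model (the generated app surely does more):

```python
def days_until_stockout(on_hand, used_last_90_days):
    """Forecast run-out from 90-day consumption velocity; naive linear model."""
    daily = used_last_90_days / 90.0
    return float("inf") if daily == 0 else on_hand / daily

def should_reorder(on_hand, used_last_90_days, vendor_lead_days, horizon_days=14):
    """Alert if stock runs out within the horizon plus the vendor lead time,
    matching the "next 2 weeks, accounting for lead times" prompt."""
    runout = days_until_stockout(on_hand, used_last_90_days)
    return runout <= horizon_days + vendor_lead_days

# Hypothetical part: 30 units on hand, 180 used in 90 days (2/day), 10-day lead.
alert = should_reorder(on_hand=30, used_last_90_days=180, vendor_lead_days=10)
```

Adding the lead time to the alert horizon is the key detail from the prompt: a part that runs out in 15 days with a 10-day lead time must be ordered now, not in two weeks.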