Buy, Borrow, Die – Explained
23 by nkurz | 9 comments on Hacker News.
Saturday, August 31, 2024
Friday, August 30, 2024
Thursday, August 29, 2024
New top story on Hacker News: Show HN: A discovery-focused search engine for Hacker News
Show HN: A discovery-focused search engine for Hacker News
20 by skeptrune | 5 comments on Hacker News.
We (Nick, Dens, Denzell, Fede, Drew, Aaryan, and Daniel) have been building HN Discovery, a discovery-focused search engine for Hacker News, in our spare time for the past 6 months and are excited to show it! It adds the following features relative to the existing keyword search interface and preserves the existing ones:
- no-JS version (hnnojs.trieve.ai)
- site:{required_site} and site:{negated-site} filters
- public analytics
- LLM-generated query suggestions based on random stories
- recommendations
- dense vector semantic search
- SPLADE fulltext search
- RAG AI chat
- order by descendant count

Client code (FOSS, self-hostable): https://ift.tt/wcm0CTb
Engine code (BSL, source-available): https://ift.tt/Ha4Ebq1
There is an extended about page with detailed information on features, how much it costs to run, etc., here: https://ift.tt/sUyjfh0
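The post pairs dense vector semantic search with SPLADE fulltext search. One common way to combine the two kinds of relevance signal is a weighted score fusion; this is a minimal sketch only, with invented function names and weights, not Trieve's actual ranking (that lives in the linked engine code):

```python
import math

def cosine(a, b):
    # Cosine similarity between two dense embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def sparse_score(query_terms, doc_terms):
    # SPLADE-style sparse score: dot product over shared weighted terms.
    return sum(w * doc_terms[t] for t, w in query_terms.items() if t in doc_terms)

def hybrid_score(query_vec, doc_vec, query_terms, doc_terms, alpha=0.5):
    # Blend dense and sparse relevance with a tunable weight alpha.
    dense = cosine(query_vec, doc_vec)
    sparse = sparse_score(query_terms, doc_terms)
    return alpha * dense + (1 - alpha) * sparse
```

With alpha near 1 the ranking behaves like pure semantic search; near 0 it behaves like pure term matching.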
Wednesday, August 28, 2024
New top story on Hacker News: Show HN: Shed Light on Your Go Binary Bloat with Go Size Analyzer
Show HN: Shed Light on Your Go Binary Bloat with Go Size Analyzer
12 by zxilly | 1 comment on Hacker News.
I've created a powerful tool to help Go developers uncover the hidden giants in their compiled binaries. Go Size Analyzer is like an X-ray machine for your Go executables, revealing:
- Which dependencies are eating up your binary size
- Unexpected bloat from standard library or vendor packages
- Size changes between binary versions with a visual diff

Key features that set it apart:
- Interactive treemap visualizations (check out the demo: https://gsa.zxilly.dev )
- Slick terminal UI for deep diving into package hierarchies
- Cross-platform support (works on Linux, macOS, and Windows binaries)
- Export to SVG for easy sharing and documentation, or just to visualize the CI process

Whether you're optimizing for edge devices, reducing Docker image sizes, or just curious about what's really inside your Go binaries, this tool provides detailed insights.
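To give a feel for the kind of rollup a binary-size analyzer performs: each symbol's size is attributed back to its package, then aggregated. This is a coarse Python illustration with an invented input format, not the analyzer's actual Go implementation:

```python
def rollup_by_package(symbols):
    """Aggregate per-symbol sizes into per-package totals.

    `symbols` is a list of (qualified_name, size_in_bytes) pairs,
    e.g. ("net/http.ListenAndServe", 1000). The package is the part
    of the name before the final dot-separated identifier.
    """
    totals = {}
    for name, size in symbols:
        pkg = name.rsplit(".", 1)[0]
        totals[pkg] = totals.get(pkg, 0) + size
    # Largest packages first, as a treemap or size report would show them.
    return dict(sorted(totals.items(), key=lambda kv: -kv[1]))
```

A treemap view is essentially this table rendered recursively, nesting sub-packages inside their parents.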
Tuesday, August 27, 2024
Monday, August 26, 2024
Sunday, August 25, 2024
Saturday, August 24, 2024
Friday, August 23, 2024
Thursday, August 22, 2024
Wednesday, August 21, 2024
Tuesday, August 20, 2024
Monday, August 19, 2024
New top story on Hacker News: Launch HN: Sorcerer (YC S24) – Weather balloons that collect more data
Launch HN: Sorcerer (YC S24) – Weather balloons that collect more data
75 by tndl | 30 comments on Hacker News.
Hey HN! We’re Max, Alex, and Austin, the team behind Sorcerer ( https://sorcerer.earth ). Sorcerer builds weather balloons that last for over six months, collecting 1000x more data per dollar and reaching previously inaccessible regions.

In 1981, weather disasters caused $3.5 billion in damages in the United States. In 2023, that number was $94.9 billion ( https://ift.tt/nLiH0VR ). The National Weather Service spends billions annually on its network of weather balloons, satellites, and aircraft sensors, generating hundreds of terabytes of data every day. This data, called observation data, is fed into massive supercomputers running advanced physics to produce global weather forecasts. Despite this cost, there are still places in the US where we don't know what the temperature will be two days from now: https://ift.tt/iySALuX... . And for the rest of the world that lacks weather infrastructure? There’s always the Weather Rock: https://ift.tt/AymXV1d .

The most important data for these forecasts come from vertical data ‘slices’ of the atmosphere, called soundings. Every day 2,500 single-use latex radiosondes are launched across the globe to collect these soundings. They stay aloft for about two hours before popping and falling back to Earth. Launch sites for these systems are sparse in Latin America and Africa, and they’re completely non-existent over oceans. This leaves about 80% of the globe with inadequate weather data for accurate predictions.

The coverage gap became painfully evident to Max and Alex during their time at Urban Sky. While building balloons for high-altitude aerial imaging, they kept running into a problem: no matter what weather forecast they used, they couldn’t get accurate wind predictions for the upper atmosphere. They tried all of the free and commercial forecast products, but none of them were accurate enough. Digging into it more, we learned that a big part of the problem was the lack of high-quality in-situ data at those altitudes.
To solve this problem, our systems ascend and descend between sea level and 65,000ft several times a day to collect vertical data soundings. Each vehicle (balloon + payload) weighs less than a pound and can be launched from anywhere in the world, per FAA and ICAO regulations. Here’s one we launched from Potrero Hill in SF: https://youtu.be/75fN5WpRWH0 , and here’s another near the Golden Gate Bridge: https://youtu.be/7yLmzLPUFVQ . Although we can’t “drive” these balloons laterally, we can use opposing wind layers to target or avoid specific regions. Here’s what a few simulated flight paths look like, to give you an idea: https://youtu.be/F_Di8cjaEUY

Our payload uses a satellite transceiver for communications and a small, thin-film solar panel array to generate power. In addition to the weather data, we also get real-time telemetry from the vehicles, which we use to optimize their flight paths. This includes maintaining the optimal spacing between balloons and steering them to a recovery zone at the end of their lifespan so we can recycle them.

These systems spend most of their time in the stratosphere, which is an extremely unforgiving environment. We’ll often see temperatures as low as -80°C while flying near the equator. Throughout the day, they experience extreme temperature cycling as they ascend and descend through the atmosphere. We’ll often encounter 100mph+ wind shears near the boundary with the troposphere (the tropopause) that can rip apart the balloon envelope. These conditions make the stratosphere a very difficult place to deploy to prod.

The real magic of what we’re building will come into play when we have hundreds of these systems in the air over data-sparse regions. But even now, we can do useful and interesting things with them. Some of our early customers are companies who fly very big, very expensive things into the stratosphere.
They use our balloons to give them a clear idea of the conditions ahead of their operations, and we’re working on a forecast product specifically designed for the stratosphere. The combination of long duration and low cost is novel: we can theoretically maintain thousands of balloons in the atmosphere at any given time for a tenth of the cost of one useful weather satellite.

We’re also using the data we collect to train AI models that produce forecasts with better accuracy than existing numerical (supercomputer) forecasts. Because we’re collecting totally unique data over areas that lack observation, our models will maintain a consistent edge versus models that are only trained on open data.

We’re really excited to be launching Sorcerer here with you! We’d love to hear what you think. And if you find one of our balloons in the Bay Area: sorry! It’s still a work in progress (and please get it back to us). I’ll leave you all with a bonus video of Paul Buchheit launching one of our balloons, which we thought was pretty cool: https://www.youtube.com/watch?v=-sngF9VvDzg
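The steering idea described above, riding opposing wind layers rather than propelling the balloon laterally, can be sketched as picking the flight level whose wind blows closest to the desired bearing. This is a toy model with an assumed per-altitude (east, north) wind format, not Sorcerer's actual flight planner:

```python
import math

def best_altitude(wind_layers, target_bearing):
    """Pick the altitude whose wind pushes closest to the desired bearing.

    wind_layers: dict of altitude_ft -> (u, v) wind components
        (u = eastward, v = northward, any consistent speed unit).
    target_bearing: desired travel direction as an (east, north) vector.
    """
    def alignment(uv):
        u, v = uv
        speed = math.hypot(u, v)
        if speed == 0:
            return -1.0  # calm air moves us nowhere; rank it last
        # Cosine of the angle between the wind and the target direction.
        return (u * target_bearing[0] + v * target_bearing[1]) / speed
    return max(wind_layers, key=lambda alt: alignment(wind_layers[alt]))
```

A real planner would also weigh wind speed, ascent/descent cost, and forecast uncertainty, but layer selection by alignment is the core of the trick.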
Sunday, August 18, 2024
Saturday, August 17, 2024
Friday, August 16, 2024
Thursday, August 15, 2024
New top story on Hacker News: Show HN: Denormalized – Embeddable Stream Processing in Rust and DataFusion
Show HN: Denormalized – Embeddable Stream Processing in Rust and DataFusion
28 by ambrood | 6 comments on Hacker News.
tl;dr: we built an embeddable stream processing engine in Rust using Apache DataFusion; check us out at https://ift.tt/ornYbIm

Hey HN, we’d like to showcase a very early version of our embeddable stream processing engine called Denormalized.

The rise of DuckDB has made it abundantly clear that even for many workloads at terabyte scale, a single-node system outshines the distributed query engines of the previous generation, such as Spark and Snowflake, in terms of both performance and cost. A lot of the workloads DuckDB is used for were considered “big data” in the previous generation, but no more.

In the context of streaming especially, this problem is more acute. A streaming system is designed to incrementally process large amounts of data over a period of time. Even at the upper end of scale, productionized use cases of stream processing are rarely performing compute on more than tens of gigabytes of data at a given time. Even so, standard stream processing solutions such as Flink involve spinning up a distributed JVM cluster to compute against even the simplest of event streams.

To that end, we’re building Denormalized, designed to be embeddable in your applications and to scale up to hundreds of thousands of events per second with a Flink-like dataflow API. While we currently only support Rust, we have plans for Python and TypeScript bindings soon. We’re built atop the DataFusion and Arrow ecosystems and currently support streaming joins as well as windowed aggregations on Kafka topics.

Please check out our repo at https://ift.tt/ornYbIm . We’d love to hear your feedback.
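As a rough illustration of the windowed aggregations mentioned above: a tumbling window slices an event stream into fixed-width, non-overlapping time buckets and aggregates within each. This is a plain-Python sketch of the concept, not Denormalized's DataFusion-based API:

```python
from collections import defaultdict

def tumbling_window_counts(events, width_s):
    """Count occurrences of each key per fixed-width tumbling window.

    events: iterable of (timestamp_seconds, key) pairs.
    width_s: window width in seconds; windows are [0, w), [w, 2w), ...
    """
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // width_s) * width_s  # bucket the timestamp
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}
```

A streaming engine does the same bucketing incrementally, emitting each window's result once its close time (plus any allowed lateness) has passed.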
Wednesday, August 14, 2024
New top story on Hacker News: Show HN: I've open sourced DD Poker
Show HN: I've open sourced DD Poker
74 by dougdonohoe | 10 comments on Hacker News.
I'm the original author of DD Poker, a Java-based computer game that ran on Mac, Linux and Windows and originally sold in stores in physical boxes. I shut down the backend servers in 2017 but the game is still functional and people can still play each other online even though the central lobby and find-a-game functionality no longer work. I've been asked over the years to release the source code, especially during the pandemic, and again this year. I finally got motivated to clean up the code and put it out there. The code is 20 years old and uses some ancient Spring, log4j, Wicket and other dependencies, but it still works on Java 1.8.
Tuesday, August 13, 2024
Monday, August 12, 2024
Sunday, August 11, 2024
Saturday, August 10, 2024
Friday, August 9, 2024
Thursday, August 8, 2024
Wednesday, August 7, 2024
Tuesday, August 6, 2024
Monday, August 5, 2024
Sunday, August 4, 2024
Saturday, August 3, 2024
Friday, August 2, 2024
Thursday, August 1, 2024
New top story on Hacker News: Ask HN: What Are You Working On? (August 2024)
Ask HN: What Are You Working On? (August 2024)
6 by david927 | 5 comments on Hacker News.
What are you working on? Any new ideas that you're thinking about?