Super interesting concept! I'm very curious whether your implementation can accurately predict which frontiers are likely to see upcoming breakthroughs. The most interesting thing would be if it could predict unexpected advances in fields that haven't had steady progress recently. Obviously predicting "this gradually advancing field will keep advancing gradually" is easy, but predicting something unlikely would really be neat. I do wonder, though, how you're planning to define breakthroughs. What metrics would determine whether the algorithm's guesses were correct? Plenty of researchers publish self-important papers with big claims that never lead to any manifest improvements in real technology. How would you draw the line? Anyway, if you have a prototype of this running somewhere, I'd love to play with it!
Viewing post in Innovation Lens jam comments
hey @extenebrislucet! Thanks for the feedback. The metric we used for validation was citation count: a "breakthrough" means that within 24 months, an article in the top 15% by citation count is published within a given radius epsilon of a prediction we previously made. There are definitely some surprising predictions that don't belong to steadily advancing fields. The most impressive is that we predicted the importance of LLM code generation back in 2023, when Cursor was just starting out. There are several videos about this prediction at youtube.com/@innovationlens. There is a live free demo at innovationlens.org (with restricted access); for a small fee you can access fresher data too.
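For anyone curious how a validation rule like that could look in practice, here's a minimal sketch. It assumes predictions and papers live in a shared embedding space where "radius epsilon" is a Euclidean distance, and all function/variable names here are hypothetical, not from the actual Innovation Lens codebase:

```python
import math
from datetime import date

def is_breakthrough(prediction, papers, epsilon, top_cite_threshold,
                    horizon_months=24):
    """Return True if any highly cited paper validates the prediction.

    prediction: (vector, date) -- predicted point and when it was made.
    papers: iterable of (vector, publication_date, citation_count).
    epsilon: max Euclidean distance from the predicted point.
    top_cite_threshold: citation count marking the top 15% cutoff.
    """
    pred_vec, pred_date = prediction
    for vec, pub_date, citations in papers:
        # Months elapsed between the prediction and the publication.
        months = ((pub_date.year - pred_date.year) * 12
                  + (pub_date.month - pred_date.month))
        if 0 <= months <= horizon_months and citations >= top_cite_threshold:
            if math.dist(pred_vec, vec) <= epsilon:
                return True
    return False

# Toy example: one nearby, well-cited paper published 17 months later.
prediction = ([0.0, 0.0], date(2023, 1, 1))
papers = [
    ([0.1, 0.1], date(2024, 6, 1), 500),   # close and highly cited
    ([5.0, 5.0], date(2023, 3, 1), 1000),  # highly cited but far away
]
print(is_breakthrough(prediction, papers, epsilon=0.5,
                      top_cite_threshold=300))  # True
```

The top-15% cutoff would of course have to be computed per field and time window in a real pipeline; here it's just passed in as a precomputed number.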
Give it a try, I'd love to hear your feedback!