What the hell is the role of Artificial Intelligence in determining the well-being of researchers?
As a researcher in academia, you might feel anxious about the fast pace of Artificial Intelligence (AI) advancements. You have limited access to the computational resources required for an impressive and impactful AI research programme, not to mention human resources (colleagues). The fact is:
As someone who does AI research at a university, you develop a complicated relationship with the corporate AI research powerhouses, such as DeepMind, OpenAI, Google Brain and Meta AI.
Private corporations and startups have far better options, and they are using them for the good of research and business. So what can you do to stay competitive while remaining an academic? This is no definitive guide, but it is a good starting point for a discussion on excelling without going private.
In this brief discussion of classification and database technology, what comes up is that nature doesn’t come with what we call identifiers. Humans give labels to things: names, codes and symbols; most of them are arbitrary human inventions.
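A minimal sketch of this idea: the same entity can be keyed by a common name, a collection code, or a machine-generated surrogate key, and none of these labels is intrinsic to the thing itself. All names here (the specimen, the hypothetical herbarium code) are illustrative assumptions, not taken from any real database.

```python
import itertools

# Arbitrary, meaning-free identifiers (surrogate keys) are generated
# by a simple counter -- the label carries no information about the thing.
_counter = itertools.count(1)

def surrogate_key():
    """Return a fresh arbitrary identifier such as 'ID-0001'."""
    return f"ID-{next(_counter):04d}"

# One entity from nature; the dictionary describes it, it does not name it.
specimen = {"species": "Quercus robur", "common_name": "English oak"}

# Three different, equally arbitrary ways humans might identify it:
by_name = {"English oak": specimen}          # a vernacular name
by_code = {"QR-1753": specimen}              # a hypothetical herbarium code
by_surrogate = {surrogate_key(): specimen}   # a generated database key

print(list(by_surrogate))
```

Whichever key you pick, lookups land on the same underlying record; the identifier is a human convention layered on top of nature, not a property of it.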
For very small problem sizes, a classical computer is faster.
Is it correct to chase the speed and promise of Quantum Computing without asking ourselves whether we also need to revise our algorithms?
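The small-problem point can be sketched with a toy cost model. The constants below are assumptions chosen for illustration, not real hardware figures: a classical linear search pays roughly n cheap steps, while a Grover-style quantum search pays roughly sqrt(n) steps, each burdened by heavy per-step overhead (error correction, slow gates). For small n the classical machine wins despite its worse asymptotics.

```python
import math

# Assumed cost constants (illustrative only, not measured values):
CLASSICAL_STEP = 1.0     # cost units per classical step
QUANTUM_STEP = 1000.0    # large assumed per-step quantum overhead

def classical_cost(n):
    """Toy model: linear search over n items."""
    return CLASSICAL_STEP * n

def quantum_cost(n):
    """Toy model: Grover-style search, ~sqrt(n) expensive steps."""
    return QUANTUM_STEP * math.sqrt(n)

# The crossover sits where n == (QUANTUM_STEP / CLASSICAL_STEP) ** 2.
for n in (10, 10**4, 10**9):
    winner = "classical" if classical_cost(n) < quantum_cost(n) else "quantum"
    print(f"n={n:>10}: {winner} is cheaper")
```

Under these assumed constants the crossover is at n = 10^6, so the quantum advantage only appears once the problem is large enough to amortize the overhead, which is exactly why asking about the algorithm (and the problem size) matters as much as asking about the hardware.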
Is it time to replace Gross Domestic Product (GDP) as a metric for well-being and sustainability? Is this needed for changing our perspective when we face climate change that endangers our existence?
Joseph E. Stiglitz, University Professor at Columbia University, chief economist at the Roosevelt Institute and winner of the 2001 Nobel Memorial Prize in Economic Sciences, suggested this in 2020.