Asking AI which political party to vote for? Not a good idea
A chatbot that tells you who to vote for in the parliamentary elections? According to professor Tim Christiaens, that carries risks. ‘Think for yourself about what kind of society you want; AI can’t do that for you.’

As the parliamentary elections approach, the familiar voting aids are popping up everywhere again. But alongside tools like Stemkompas and Kieswijzer, some people are now also consulting AI chatbots such as ChatGPT. A simple question, "Which political party should I vote for?", immediately produces an answer. That advice may seem convincing, but it is often not embedded in the proper political context.
Misleading and superficial advice
This starts with the sources, which are problematic, says Tim Christiaens, professor at the Tilburg School of Humanities and Digital Sciences (TSHD). A chatbot may draw on, among other things, columns and opinion pieces. ‘If the information comes from opinion articles, where numbers are sometimes manipulated, then the chatbot gives misleading advice,’ Christiaens explains.
People with less political knowledge in particular run the risk of receiving misleading or superficial voting advice.
There is also the problem of sycophancy: the tendency of AI to confirm users in their existing beliefs. ‘If you start from a misguided opinion and the machine keeps reinforcing it, you end up in a rabbit hole that is hard to escape,’ says Christiaens. He cites this as a main reason why AI cannot provide neutral voting advice.
Handing over responsibility
What Christiaens warns against is cognitive offloading: outsourcing your thought process to technology. ‘The biggest risk is that people hand over their responsibility and think: the chatbot will decide who I should vote for.’
As a result, voters reflect less on political choices themselves. They base their preferences on the answers AI gives rather than on their own values and principles.
Christiaens believes it’s better to stick with traditional alternatives, such as watching the news or following debates. In his view, politics and AI are fundamentally different: politics is about the worldview behind a political party, something AI often knows nothing about. ‘Politics is ultimately about thinking for yourself about what kind of society you want. A database can’t do that for you,’ he says.
An alternative chatbot that informs
Christine Liebrecht and Naomi Kamoen, who are researching whether chatbots can help voters understand the elections, share Christiaens’ concerns about AI as a source of voting advice. That’s why they are currently developing their own chatbot. ‘The goal of our chatbot is to inform, not to advise. The chatbot explains difficult terms and provides context on issues, but does not tell you which party to choose,’ says Liebrecht.
Because Liebrecht and Kamoen formulate all the answers themselves, their chatbot does not learn autonomously and does not invent information. In this way, the earlier problems are avoided, creating a safe alternative to voting tools without the risks of a self-directing AI such as ChatGPT.