How public opinion polls are actually conducted
The Levada Center reports that in the year since the war in Ukraine began, support for it has grown from 68% to 75%, approval of Vladimir Putin's actions from 71% to 83%, with abstentions at only 3% of all respondents. At the same time, the center's website gives no data on how the surveys were conducted, how many respondents took part, and, most importantly, how many refused to answer. Most of its other polls state that they were conducted through door-to-door, face-to-face interviews with 1,600 people. This in itself is a worrying indicator, since Russians are reluctant to open their doors to strangers, and even less likely to criticize the authorities in conversation with a stranger who knows where they live.
Modern sociology relies on four main methods of public opinion polling: the telephone interview, in which a computer selects a number from a shared database of mobile and landline numbers and an interviewer conducts the conversation; the street survey, in which the interviewer accompanies the respondent along their route, reducing the time required; the door-to-door canvass, in which the interviewer comes to the respondent's home without prior arrangement; and the easiest method for a pollster, online voting, where, however, a representative sample is harder to control.
Of all these types of surveys, the door-to-door canvass is the least anonymous, which is why responsible sociologists try not to use it in surveys about war and politics (if they decide to conduct such surveys at all). And it is not just the lack of anonymity, sociologist Grigory Yudin explains:
“In route surveys, face-to-face interviews, when you have interviewers going from house to house and talking to people, the main problem is the inability to enter certain types of housing, because it is surrounded by fences, intercoms, security guards, and so on. You simply cannot reach these people. This matters also because the Levada Center, for some reason completely unknown to me, insists on route sampling. Route sampling is absolute madness. It is much worse than telephone surveys, and not only because of the bias, but also because it is harder to control: at some point the unfortunate interviewers begin to 'fill in' the questionnaires themselves, assuming they already know what the respondents would answer. In telephone surveys, if it is a call center where all conversations are recorded, this is harder to do.”
Since the beginning of the full-scale invasion of Ukraine, apart from the three best-known Russian pollsters (VTsIOM, FOM, and the Levada Center), two independent sociological projects, Russian Field and Khroniki (Chronicles), have been studying Russians' attitudes toward the war. Both use only the telephone survey method, which they consider the best approximation of respondents' real positions. Russian Field sociologist Daria Pavlova explains how they achieve representativeness in their polls:
“We start with sample modeling. We know roughly how many people of what ages live in each region. If you need to interview 100 people, we assume that 15% of them should be young people, 30% pensioners, and so on. All of this is turned into quotas. Surveys are now conducted using various applications in which you can specify the age and gender of the person you need, and once you exceed that number, the respondent can no longer take the survey: the operator tells them that their quota is closed, and the survey ends.
If a person agrees, their age is confirmed, the operator determines their gender by voice, and if the person fits the quota, they take part in the survey. At the end, additional socio-demographic parameters are recorded: level of education, income, and whatever else may interest us. We do not weight them to Rosstat data, because Rosstat's income data are implausible, and its education data are not available in the required detail either; that is a separate problem.”
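The quota mechanics Pavlova describes can be sketched in a few lines. This is a hypothetical illustration, not Russian Field's actual software; the quota targets and the `try_enroll` helper are invented for the example:

```python
# Hypothetical sketch of the quota logic described above: the sample is
# split into demographic cells, and once a cell's quota is filled,
# further respondents matching it are turned away.
from collections import Counter

# Invented quota targets for a 100-person sample (the article mentions
# shares like 15% young people and 30% pensioners as examples).
QUOTAS = {
    ("18-24", "F"): 7, ("18-24", "M"): 8,
    ("25-59", "F"): 28, ("25-59", "M"): 27,
    ("60+", "F"): 17, ("60+", "M"): 13,
}

filled = Counter()

def try_enroll(age_group: str, gender: str) -> bool:
    """Return True if the respondent may take the survey,
    False if their quota is already closed."""
    cell = (age_group, gender)
    if cell not in QUOTAS or filled[cell] >= QUOTAS[cell]:
        return False  # the operator tells the caller the quota is closed
    filled[cell] += 1
    return True
```

Once `filled` reaches the target for a cell, every further respondent in that cell is turned away, which is what closing a quota means in practice.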
Unlike VTsIOM, FOM, and the Levada Center, Russian Field and Khroniki publish their results along with the response rate, an indicator that is extremely important to sociologists: the proportion of those who agreed to be interviewed. Why the big polling organizations bashfully hide this figure is understandable: in today's Russia it is negligibly low.
Nobody wanted to answer
The share of those who agree to answer questions, the response rate (RR), has been declining for years in Russia, as in many other countries. When Levada Center sociologist Alexei Levinson is asked whether polls with a low RR can be trusted, he usually answers that in the West it is also low, and yet polls are trusted there. In some Western countries the RR is indeed low, but what matters is not the RR itself; it is how the decline in RR affects the error (nonresponse bias). In the US, for example, the RR in telephone polls has fallen from about 40% in the 1990s to 6% today, which worries sociologists a great deal. On the whole, however, they concluded that even with such a low RR, surveys can remain representative as long as people's decision to participate has nothing to do with their answers. Americans stopped answering phone surveys because of the prevalence of spam disguised as surveys, and all categories of people dislike spam about equally. So if you are interested, say, in what Americans think of presidential candidates, a low RR will not have much impact on the margin of error. In Russia, the RR in telephone surveys is about the same as in the US (5–7%), but in Russia the reasons for not answering are very often directly related to politics.
In Russia, about 94 out of 100 people do not respond to sociologists' telephone surveys
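The distinction drawn above, that a low response rate is harmless only when refusal is unrelated to the answer, can be demonstrated with a toy simulation. All numbers here are synthetic: the 6% response rate echoes the US figure cited above, and the assumed 50% true support is arbitrary:

```python
# Toy simulation (synthetic data): the same low response rate can be
# harmless or badly distorting, depending on whether refusal is
# correlated with the answer.
import random

random.seed(0)
TRUE_SUPPORT = 0.5  # assumed true share of "support" (arbitrary)

def poll(n_calls: int, p_answer_support: float, p_answer_oppose: float) -> float:
    """Dial n_calls numbers; supporters and opponents pick up with
    different probabilities. Return the measured share of 'support'."""
    yes = total = 0
    for _ in range(n_calls):
        supports = random.random() < TRUE_SUPPORT
        p = p_answer_support if supports else p_answer_oppose
        if random.random() < p:
            total += 1
            yes += supports
    return yes / total

# Refusal unrelated to opinion: both groups answer at ~6%.
unbiased = poll(200_000, 0.06, 0.06)
# Refusal tied to opinion: opponents answer half as often (e.g. fear).
biased = poll(200_000, 0.06, 0.03)
# unbiased stays near the true 0.5; biased drifts toward 0.06/0.09 = 2/3.
```

In the second case the measured support climbs to roughly 67% even though true support is 50%: the overall response rate is similar, and only its correlation with the answer differs.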
Sociologist Daria Pavlova gives some examples of respondents' motivation:
“The main refusals that we analyzed in the questionnaires happen silently: people just hang up without trying to explain why. But there are those who say something. The main reason: 'Why are you calling me? I'm not a politician, I'm not an expert, there's no point asking me.' The second reason: 'Busy, no time. Don't want to talk.' The third reason: 'I'm afraid you're scammers; now you'll ask for my passport details, you know my phone number, then you'll drain my account.' The fourth category fears the call is coming from Ukraine. At some point the idea circulated in pro-Russian internet communities that if someone calls and asks you about current events, you should ask, 'Whose is Crimea?', and if the operator does not answer, the call is from Ukraine and you should not talk.
Operators complained to us that this happened regularly. The operator, as a rule, has no right to deviate from the logic of the questionnaire or to answer questions themselves, except those about how the survey is conducted. The operator explains that they cannot answer questions, and the person says: 'So you're calling us from the Armed Forces of Ukraine. Goodbye!' There are also many people who think provocateurs are calling them. And of course there are those who hang up because 'they'll be traced by their number and the FSB will come after them.' This is a small category among the refusals, one of the smallest groups. But we will most likely never be able to tell how many of them there really are, nor what kind of bias they introduce.”
One of the main reasons for refusing to answer is distrust, in particular distrust of the state, and in Russia it is traditionally assumed that whoever conducts a survey is connected with the state, sociologist Grigory Yudin explains:
“In Russia, polls are perceived as communication with the state, as an instrument of surveillance. In America, of course, there is nothing of the kind: it is known that polls are conducted by institutions, and they may have different political loyalties. Some are closer to the Democrats, some to the Republicans, some are purely academic, but no one believes the state is running them. In Russia, the overwhelming majority of respondents think exactly that, and many see the survey as a good opportunity to complain about the 'bad boyars.' Every interviewer has heard the phrase 'Tell Putin that…' more than once. It is useless to explain that you have nothing to do with Putin, that you represent the Levada Center, an independent organization. 'Who are you kidding? Yesterday Putin said on TV: let's conduct a survey, and today you're telling me you have nothing to do with Putin.' The Levada Center is not to blame for this, but it does not change the situation in any way.”
The Levada Center desperately resists these arguments. In 2022, the organization published a special study on the subject, aiming to draw a portrait of the people who do not respond to surveys, and concluded that there is no fundamental difference between responders and non-responders. The approach is striking from the outset. First, the sociologists write that they took people who had answered their surveys in the previous year, called them with the next survey, and then compared the socio-demographic characteristics of those who answered with those who did not. In other words, they measured not those who never talk to sociologists at all, but only those who usually answer yet ignored them this particular year. Second, the sociologists measured the RR at 30%, but, again, this is the share of responses from people already in their database, that is, from people who usually talk to sociologists. Even if only 30% of those people respond, a random sample may yield far fewer, the same 5–7% that Russian Field observes. Finally, the study does nothing to disprove the hypothesis that Russians evade answering because they do not trust the state and are afraid of sociologists. If anything, it rather confirms it, since, according to the study itself, the highest refusal rate is among young Russians (under 24), precisely the most oppositional group in Russia.
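The arithmetic objection above can be made explicit with a back-of-the-envelope sketch. Only the 30% repeat rate comes from Levada's study; the first-contact rates are hypothetical, and the multiplicative model is an assumption for illustration:

```python
# Back-of-the-envelope check (assumed numbers): a 30% response rate
# measured only among people who already answered once overstates the
# response rate a fresh random sample would produce. If a fraction r0
# of the general public responds at first contact, and 30% of past
# respondents respond again, the implied fresh-sample RR is ~r0 * 0.30.
repeat_rate = 0.30  # Levada's figure, measured among prior respondents

for r0 in (0.10, 0.20, 0.30):  # hypothetical first-contact response rates
    implied = r0 * repeat_rate
    print(f"first-contact RR {r0:.0%} -> implied fresh-sample RR {implied:.1%}")
```

Under these assumptions the implied fresh-sample rates are 3%, 6%, and 9%, which brackets the 5–7% that Russian Field actually observes.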
Polls during the war
For a long time, few people were interested in opinion polls in Russia: the country has had no free elections for many years, and journalists remembered public opinion only when mass protests broke out. But after the start of the full-scale invasion of Ukraine, the world's media all hastened to find out what "ordinary Russians" think about it. That is when the Levada Center triumphantly took the stage with figures showing nationwide support for the course of the party and government. Questions remained, however: if the war is so popular, why do so few Russians put Z or V symbols on their cars, why do volunteers barely come to the front, why does no one come out to rallies in support of the war unless the authorities assemble them on orders, and why are even once loyal, well-integrated show-business stars, from Alla Pugacheva and Maxim Galkin to Iosif Prigozhin and Valery Meladze, speaking out against the war? Yet the Western media continued, and continue today, to quote Levada's figures on support for the war uncritically, adding that even independent sociologists supposedly confirm these statistics. Perhaps that is why the Levada Center has still not been declared an "undesirable organization" and shut down, the fate of the vast majority of civil society institutions.
In fact, independent sociologists notice anomalies indicating that the published results of polls about the war are distorted. Sociologist Elena Koneva, founder of the independent research agency ExtremeScan and a participant in the Khroniki (Chronicles) project, explains:
“The first results of street polls in February 2022, which recorded 58% support for the war, were a shock to us. It seemed simply impossible. We began looking for an explanation, suggesting it might be related to socially desirable responding. When the law on 'fakes' came out a week later, we knew for certain that it could distort the results of our surveys. So we decided to conduct a series of methodological experiments. We added 'difficult to answer' and 'do not want to answer this question' response options to the wording of the main support question. This seemingly minor change had a significant impact on the results.
Throughout our study, about a third of respondents (31%) have found it difficult to answer, or refused to answer, a direct question about support for the war, an abnormally high proportion for the defining issue in the life of the country. The largest share of those who evaded a specific answer (36%) came on September 29–30, 2022, immediately after the announcement of the 'partial' mobilization. Throughout the year, 10–13% of respondents said outright, 'I don't support it.' The data show that the real figure is several times higher, because of those who, for safety reasons, have retreated into the position of 'I don't want to answer this question.'
I would also like to warn against getting carried away with surface-level numbers. Even if we record over 50% support during the war, this does not mean real support. It is important to understand that people can mean different things by 'support.' As for my current understanding, I would argue that real support for the war stands at about 35%, while at least 30% of Russians are real opponents of it. By real support I mean cases where people can back their position with additional arguments or a willingness to do something: fight, give money, and so on.”
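Koneva's point that the openly stated 10–13% understates real opposition can be bounded with simple arithmetic. This is an illustration, not her calculation; the fraction `f` of hidden opponents among the evaders is unknown:

```python
# Rough bounds implied by the figures quoted above (illustrative only):
stated_oppose = 0.12  # ~10-13% answer "I don't support it" outright
evaders = 0.31        # ~31% refuse or find it difficult to answer

# If a fraction f of the evaders are hidden opponents, total opposition
# ranges from 12% (f = 0) to 43% (f = 1).
for f in (0.0, 0.5, 1.0):
    total = stated_oppose + f * evaders
    print(f"if {f:.0%} of evaders secretly oppose -> opposition ~ {total:.1%}")
```

At f = 0.5 the total is about 28%, roughly in line with Koneva's estimate that at least 30% of Russians are real opponents of the war.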
The fear that Russians do not want to talk frankly with sociologists is by no means hypothetical. There have already been examples of punishment for "wrong" answers in opinion polls, sociologist Grigory Yudin recalls:
“A simple example. Deutsche Welle conducts a survey on the streets of Moscow: how do you feel about the deliveries of Leopards? Most, quite unexpectedly for Deutsche Welle itself, say: 'It's right that they're being delivered, it's high time.' This is shown on YouTube, and criminal cases are opened against these people. After that, Aleksey Levinson <head of the department of socio-cultural research at the Levada Center — The Insider> says: 'No one is afraid.' Well, how can that be? How can this be taken seriously? We have every reason to believe that the samples are biased.”
Daria Pavlova of Russian Field also does not believe Levada's claims that the war has not affected the sincerity of responses:
“At the beginning of the war Levada said that their response rate had not fallen at all and that everything was fine. We heard the opposite: people began opening their doors much less often. In general, it is hard to believe that every fourth apartment opens up and answers questions. It would be interesting to see how things really stand. Recording refusals in street and apartment surveys is very difficult, because the human factor interferes strongly. A computer logs a hang-up on the phone automatically, but on the street every interviewer has to do it themselves. You work all day, walk down the street, everyone brushes you off and swears at you, and after each encounter you are also supposed to note that the person refused; it is easier not to. We tried counting refusals in different ways. We handed out clickers (a device that makes a clicking sound when pressed), for example, but we saw that different interviewers reported very different refusal counts. Sometimes it was two refusals a day, but how can that be? To verify that a person really clicks at the moment of refusal, and not just after completed interviews… We need to look for technical methods; it is not very convenient. You could claim that our interviewers are so wonderful that they log every refusal and follow every procedure and control everything, but that would be difficult to verify.”
Society without opinion
The main problem with polls in Russia lies not so much in how they are conducted as in the fact that they are conducted at all, a number of independent sociologists argue. A survey rests on the hypothesis that people have some formed opinion on key political topics, but a significant part of the population has no definite opinion, and wants none, on issues over which, as they believe, they have no influence. Grigory Yudin believes that this indifference, a product of learned helplessness, is an even more important factor than fear:
“It's not just about fear; it is not the case that everyone is simply terrified into silence. But Russia is not divided into people who are for Putin and people who are against him. This is what is very difficult to grasp and what polls do not let you see: a significant part of Russians, by and large, do not care. And that is our main problem: not that people are for Putin, but that people do not care. That is why the war is going on. This answer option appears almost nowhere, and 'I find it difficult to answer' does not replace it, as you understand. So people who do not care begin to invent something on the spot. They reason: 'Well, since I agreed to answer, I must have some relation to the special military operation. What is it? Putin probably knows what he's doing. I probably support it, then. But why?' And so on. As a rule, it is not that everyone hates Putin and then lies to the pollsters. People do not lie; they simply improvise on the spot. Some ideological opponents of Putin, yes, will most likely say that they are his opponents. But there are not many of them in Russia. Nor many committed supporters. Mostly, people do not care.”
Sociologist Daria Pavlova agrees: a large group of the population, perhaps the largest, is made up not of those who are for or against, but of those who "do not want to think about it":
“By no means all Russians share the view that we have a dictatorship, war, and fascism. Not because they have been told to think otherwise, but because people are simply not used to thinking about it and do not want to; they watch television, believe the propaganda, and so on. There can be many reasons why people do not analyze what is happening. Those who do understand say: 'They want freedom, but there is none, and that is why they answer that everything is fine.' In fact, our analysis shows that the largest group is those who simply do not want to think about it, who want it not to touch them; they are neither for nor against. We cannot tell how sincerely people support or oppose anything, because they say: 'Why are you pestering me with your questions? I want to go feed my piglets.' That was a real answer from one respondent: 'I have two pink piglets. Why are you asking me such questions? I'd rather go feed them.'”
The Russian authorities actively exploit Russians' apoliticism and indifference. For the Kremlin, it is less important to convince a Russian that Putin is doing everything right than to convince them that everyone around them supports Putin; that alone is enough to keep them from thinking about protest. This, independent sociologists believe, is the main danger of publishing Levada's polls: the mythical 80% support that everyone talks about helps the Kremlin manufacture the population's loyalty. “Nobody bought them; it is an ideological problem. It is hard to believe, but ideologically there is no difference between their demophobic discourse and the demophobic discourse of the Kremlin. They believe in the same thing: in a dreadful, frightening people-as-pack. Putin believes it, Lev Gudkov <director of the Levada Center — The Insider> believes it; they all believe it and simply draw different conclusions. Putin says: 'Well, then we should take advantage of this and wage imperialist wars,' while Lev Gudkov says: 'Then we should weep over what a monstrous people we have.' Putin will not weep, he has other things to do, and Lev Gudkov will not support imperialist wars, but at the level of worldview there is no difference at all,” Yudin notes.
A poll in a concentration camp
Does this mean that no questions on political topics are possible at all under a warring dictatorship? It does not; it is just that the goal of such polls should be not to identify the "opinion" of particular population groups, but rather to detect interesting trends or phenomena, Grigory Yudin believes:
“I am very fond of the scheme Russian Field introduced: they began asking a battery of two questions, each opening with 'if tomorrow Vladimir Putin decides that…', followed by two mutually contradictory decisions, for example, to continue the 'special military operation' or to end it. In both cases an overwhelming majority supports the decision, with only a few percentage points' difference, even though the two should rule each other out. Why? Because the question is not really about which decision you support. The question people actually hear is 'do you support Vladimir Putin?' 'Well, of course we do.'
This is the greatest danger, because these polls, of course, express no 'will of the people' whatsoever. The people who answer them do not understand them that way at all; they hear something else in them. By and large, the question sounds like: 'Are you ready to go up against the emperor or not?' Here, yes, I would agree that some 80% will say: 'No, we are not.' But did we not know that already? Did we need polls for that? We can see it anyway; there is no sensation here. Reading these polls as the result of a plebiscite is the most mistaken way of perceiving them. In that case you are simply constructing an object that does not exist.
These polls, of course, express no “will of the people” whatsoever
This does not mean that polls are entirely useless. People who know what they are doing can find meaning even in them. For example, you can look at cross-sections, that is, breakdowns by various indicators. Or at the dynamics over time. Or at the differences between groups: age groups, income groups, and so on. But this is exactly what the major pollsters will never publish, because it immediately casts doubt on how monolithic the whole picture is. The Levada Center, too, publishes such breakdowns quite rarely.”
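The kind of breakdown Yudin recommends, cross-sections rather than one headline number, is easy to sketch on synthetic data. The responses below are invented purely for illustration:

```python
# Minimal cross-tabulation sketch: share of each answer within each
# age group, computed from invented (age_group, answer) pairs.
from collections import defaultdict

responses = [  # synthetic data, not real poll results
    ("18-24", "oppose"), ("18-24", "hard to say"), ("18-24", "support"),
    ("60+", "support"), ("60+", "support"), ("60+", "hard to say"),
]

table = defaultdict(lambda: defaultdict(int))
for age, answer in responses:
    table[age][answer] += 1

for age in sorted(table):
    n = sum(table[age].values())
    shares = {a: round(c / n, 2) for a, c in sorted(table[age].items())}
    print(age, shares)
```

Even on a toy table like this, the group-level shares immediately show structure that a single aggregate "support" figure would hide.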
Beyond polls proper, there is also qualitative sociological research. It cannot measure the number of adherents of particular ideas, but it can reveal the logic and arguments of particular groups, including those that support or oppose the war, sociologist Elena Koneva believes:
“It goes without saying that there is no high support for the war. Some explain support for the war by a 'revival' of imperial thinking, claiming that nostalgia and a desire to recreate the Soviet Union are awakening in people. But we do not see this in our research. Yes, there is a certain nostalgia, but very few people seriously see either the possibility of this or the need for it.
As for the war's opponents, the pre-war non-politicization of most of today's opponents helps to understand their anxious, depressive, and seemingly passive profile. They were set on a normal, peaceful life; they held skilled jobs and raised children. In a democratic state they would have been a normal, healthy part of society, a middle class similar to Europe's, for whom protesting against terrorist attacks or a higher retirement age is simply part of ordinary life. Despite their present atomization and overall pessimism, these people have the preconditions for coming to terms with the terrible war their country has unleashed. And when that happens, there will be a chance of a step from silent non-support to open non-support, and on to whatever forms of anti-war activity remain available.”