Tools for the Battlefield of Information and Knowledge

Contributor: Alonso Hernández

Although today’s political discourse doesn’t exactly suffer from a surplus of rationality and common sense, it’s still important for organizers to have a solid command of the facts in order to build a good argument. 

Campaigns unfold over a series of confrontations in which it's necessary to formulate and articulate responses, and to motivate action. Here, there are a number of capacities we need to develop, such as tactical intelligence, creativity, narrative ability, and even the effective use of humour (as Srdja Popovic brilliantly demonstrates in his book "Blueprint for Revolution"). It will certainly be challenging to successfully navigate the often-dirty attacks from the right, but if populism has fake news and fear as its main weapons, we progressives have the projection of hope and the command of truth and reason.

For a successful campaign, it is necessary to have access to information and facts, build strong narratives, prepare convincing arguments, and be ready to counter difficult questions. This is no small effort. Progressive organisations usually lack the resources to conduct large-scale data research, so it is crucial to rely on innovative solutions that include good knowledge management. Instead of hiring costly consultants, they can combine digital tools with a little help from experts to prepare a good campaign.

One of the most difficult aspects of conducting research is processing data online: we need the right technology to build a robust knowledge base from which to develop a campaign's arguments. Luckily, Octoparse and Parsehub, the tools we feature in this article, can be of great help in doing so.

"Parse" is the keyword, because it is what this software does, mainly - it browses, collects, and exports published content automatically.

Let's say we want to pull all of our adversary's blog posts and classify them by topic in a spreadsheet, to get a better view of how they approach issues and to be able to quote them quickly. Or maybe we want to browse various websites on a particular topic and download all the news relevant to our research. This would ordinarily have required a coder to develop a small crawler to carry out these specific tasks - but these tools allow us to do it with just a few clicks.
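To make concrete what such a small crawler actually does under the hood, here is a minimal sketch in Python using only the standard library. The HTML snippet and the link structure are invented for illustration; a real crawler would fetch the page over HTTP from the target site, but the parsing and CSV-export steps would look much the same.

```python
import csv
import io
from html.parser import HTMLParser

# Sample HTML standing in for a fetched blog index page.
# (In a real crawler, this string would come from an HTTP request.)
SAMPLE_HTML = """
<ul class="posts">
  <li><a href="/2023/tax-reform">Why the tax reform fails workers</a></li>
  <li><a href="/2023/housing">Our plan for housing</a></li>
</ul>
"""

class PostLinkParser(HTMLParser):
    """Collects (url, title) pairs from the <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.posts = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.posts.append((self._href, "".join(self._text).strip()))
            self._href = None

parser = PostLinkParser()
parser.feed(SAMPLE_HTML)

# Export the collected links to CSV - the same kind of tabular
# output that point-and-click scraping tools produce for you.
buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow(["url", "title"])
writer.writerows(parser.posts)
print(buffer.getvalue())
```

The value of tools like Octoparse and Parsehub is that they generate the equivalent of this logic from a visual workflow, so no one on the team has to write or maintain code like this.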

Octoparse and Parsehub allow us to easily build journeys, or workflows, for the bots that collect content online. We can also add conditionals (for example, to filter out blog posts on an irrelevant topic) so that the download is more targeted, and we can even search through private areas of the internet that require a login and password, such as intranets or paywalled content.
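The conditional step is easiest to picture as a filter over the scraped records. The sketch below uses invented post data and topic names purely for illustration; it mirrors the kind of "only keep items matching these conditions" rule you can configure in a scraping workflow.

```python
# Hypothetical scraped records - in practice these would be the
# rows a scraping tool has collected from the target site.
posts = [
    {"title": "Our plan for housing", "topic": "housing"},
    {"title": "Club football results", "topic": "sports"},
    {"title": "Why the tax reform fails workers", "topic": "economy"},
]

# The conditional: keep only posts on topics relevant to the campaign,
# so the export contains no noise to clean up by hand later.
RELEVANT_TOPICS = {"housing", "economy"}
relevant = [p for p in posts if p["topic"] in RELEVANT_TOPICS]

print([p["title"] for p in relevant])
```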

Thanks to the work of Octoparse and Parsehub, we will have spreadsheets or JSON files full of useful information to process, all without spending time collecting it manually. Based on that information, we can go further, for example:

  • Using data visualization tools, such as Tableau or Google Data Studio, to better understand what has been collected;
  • Using ETL (Extract, Transform and Load) tools, such as the SSIS Designer in Microsoft Visual Studio, to incorporate the data into our database;
  • Or even using natural language processing tools, such as the Google Cloud Natural Language API, for a more detailed analysis of what we have collected based on keywords, sentiment analysis, entities, etc.
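Even before reaching for a visualization or NLP service, a first pass over the exported file can be done with a few lines of code. This sketch loads a small JSON export (the field names here are illustrative, not the tools' actual schema) and counts the most frequent words - a crude but dependency-free way to see which themes dominate the collected material.

```python
import json
from collections import Counter

# A small JSON export of the kind scraping tools produce.
# Field names and contents are invented for this example.
EXPORT = """
[
  {"title": "Why the tax reform fails workers",
   "body": "The reform raises taxes on workers while cutting taxes for corporations."},
  {"title": "Our plan for housing",
   "body": "Affordable housing requires public investment, not tax breaks."}
]
"""

posts = json.loads(EXPORT)

# Simple keyword frequency across all collected text.
words = Counter()
for post in posts:
    text = (post["title"] + " " + post["body"]).lower()
    words.update(w.strip(".,") for w in text.split())

print(words.most_common(5))
```

In practice you would also strip out common stop words ("the", "for", "on"...) before counting; dedicated NLP tools handle that, plus sentiment and entity extraction, out of the box.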

With these tools, we can work at a level where our time produces much more value.

Extra tools: We want to share this wonderful article, which brings together a very interesting collection of tools for researchers, alongside other tools for content creators - all of them super useful for organisers.