At the end of The Guardian’s Changing Media Summit, BBC technology reporter Kate Russell urged conference attendees to keep the conversation going.

Such calls have become platitudes, with some conferences less successful than others at accomplishing the aim. That’s not surprising: once you return to the day job, the time to reflect dissipates with the next company pitch imminent.

So just how do you keep the conversation going, and which conversation? And what’s the incentive for keeping up a collective tweet count when conference memories decompose, supplanted by more pressing day-to-day issues?

What if, as the conference wrapped up, with ideas shaken loose from their molecular structures and looking for new bonds, there was an opportunity to refract and distill them for a future summit?

At the very least, it’s a way of harnessing the ideas in the room and some of the critical views we tend not to hear: views that can manifest as critiques, which often strengthen ideas, as any pitch team will tell you.

Beyond this, does the effort serve any purpose other than churning up ‘done-that’ material when the next conference looms? It is reasonable to think that what was showcased at the last gathering hardly constituted a critical debate on a subject. This isn’t a criticism. Conferences are about showcasing ideas.

However, a delegate at the Guardian Media Summit poignantly captured a tangible feeling with a question to a panel. It was something along the lines of: he, the attendee, could hear a conversation amongst the panellists, but not much of a debate or divergent opinions.

If there is one thing modern-day conferences have not resolved, it’s how to become less “I’m talking to you, so listen” and more “I’m talking with you, so I’d like to hear what you have to say too”. So much for the conversation era.

As a senior university lecturer I face this as an everyday challenge: how to move the lecture space from a systemised production belt, in which knowledge tends to flow one way, to a forum where there’s more symmetry in exchanges.

Yet conference knowledge sourcing (CKS) has a piquant purpose. I have been attending and speaking at conferences since 1992 in South Africa, and on a more frequent basis since 2005: Restoring the Trust (Austin, Texas, 2005), We Media UK (2006), more recently speaking at Apple, and next week I’m at the RTÉ mojo conference in Dublin.

I am a serial conference attendee, but for good reason: to find the unconventional, the artist, the methodology that sparks a new line of enquiry. I know myself well enough now to know that my first and second responses to hearing something rely on explicit thinking, and that somewhere around the eighth or ninth, I’m into the idea.

The corollary of this has been falling into a reflective practice, whereby the researcher, in this case me, taps their experience to decipher phenomena as problems or opportunities along a timeline. This isn’t unique. You do the same in some way.

The value of this explication is predicated on the practitioner’s experience. So if I say I have 28 years’ media experience, have worked for the likes of Channel 4 News, the BBC and dotcoms, have trained countless media professionals and won awards such as the Knight-Batten Award for Innovation in Journalism, that frames my experience.

It may also provide reason for you to understand how I might wring comparative knowledge from attending conferences. Really! The things some academics do.

From the heady days of We Media, when internecine war all but raged between newer media and traditional media, today we’ve reached a general consensus.

There exists an equilibrium between the force of ideas from yesteryear, from digital thinkers to those now calling themselves digital players. We’ve reached a general stasis in knowledge, which is naturally becoming conventionalised.

It happens all the time, from Edison’s light bulbs to Dyson’s new product range and now Twitter and the rest. This is not to deny that innovation is still taking place; it’s just that at some point knowledge becomes somewhat saturated. It requires new sparks.

Weren’t you a little surprised at the low number of delegates wanting to ask questions at the back end of a talk? Yes, there could be myriad reasons why that is, but hands shooting up is nonetheless a visible barometer.

Is part of the problem that there’s not a lot more to squeeze from the idea, or that in some observed cases future gazing is not just difficult but a perilous pursuit? As Marshall McLuhan said:

There are a lot of people busy predicting the future. I’ll leave them to it, the futurologists and certain sociologists, that’s their job, to look at the future. Historians take care of the past. I’ll tackle the really tough one: the present. Let me see if I can predict the present.

Arguably, there was enough legacy from the present conference to indeed keep the conversation going, but I’m back to the question: why would you want to do that?

Take this example. Modernist storytellers today generally use words such as ‘authentic’, ‘great stories’, ‘immersive’ and ‘truth’ as if they’re new terms, and often cite traditional media elements to support them. Upworthy’s Peter Koechley did this.

But how do these labels, which describe the impact of a film, differ now compared with, say, Ed Murrow’s 1960 Harvest of Shame?

At least cinema makers acknowledge that what you do with a camera can alter the reality. It doesn’t necessarily make the product better, just different.

So here’s my pitch to the Guardian: a three-minute post by delegates or others engaging one of the themes, with the result that the crowd becomes partly responsible for endorsing an idea’s inclusion in the next summit.

Also, that the manner in which these ideas are delivered in a forum becomes more critical: a de facto commercialised thesis viva. I say commercialised because the process doesn’t have to conform to academic architectural procedure.

If anything, it’s a way of opening up the debate to unrecognised talent, or otherwise of peddling professional mischief. Either way, I thought I’d throw my two pence in by posing the following. It goes like this.

That cinema will become an increasingly used craft skill for telling factual stories, and that Vice et al are just the tip of a juggernaut of an iceberg. Various elements need framing: what do I mean by cinema, where’s the evidence and what are its implications? And, as I’ll explain, it’s not one size fits all. It requires creativity of a type that is culturally bound.

That’s my next post.

Top Writer & Creative Technologist, Int. Award Winner. Cinemajournalist. Cardiff Uni @jomec. PhD (Dublin). Visiting Prof UBC, Ex BBC/C4News. Apple profiled.
