The large agricultural machinery manufacturers have grown greatly excited by the prospect of artificial intelligence (AI) and how it will alter the way we go about farming, in everything from crop husbandry to self-driving tractors.
However, has this excitement been misplaced? Was too much optimism shown, and can AI ever live up to what was expected of it?
The answers look ever less promising as the months roll by: few of the promises are being fulfilled, and even when they are, the products of AI carry large flaws.
Artificial intelligence
To see quite why the buzz over this latest wave of computing may now be slowly deflating, it is important to understand just what AI actually is, and herein lies the first problem.
The definition of AI has no strict lines of demarcation; it is a rather woolly concept that is often used to impress rather than enlighten, especially by company marketing departments.
Within computing circles, it is generally held that AI is the development of computers that will be capable of performing tasks that historically required human intelligence.
Computers are very good at doing lots of sums very quickly; what they have hitherto been unable to do is decide what sums need doing and why, and then do something useful with the results.
The great hurdle
AI is meant to take the next major step, which is to create computers that can reason, or think for themselves, and it does not look as if that stage has been reached, despite programs such as ChatGPT giving the illusion of doing so.
The concept of AI now stands in a sort of no man's land between advanced computing and true independence from human input. Even if the path out of this wilderness were clear, following it is likely to be very expensive; so expensive, in fact, that adoption of true AI is likely to be very slow indeed.
Behind all the hype being generated over artificial intelligence, there is some serious academic research going on, with one paper in particular often cited in the computing press as a true indicator of the state of progress in the field.
Published in January of this year by the MIT-based authors M. Svanberg et al., the paper looks at one particular workplace task held to be suitable for the introduction of AI: computer vision.
The ability to recognise scenarios and then act upon them is of huge interest to the agricultural world, although farming was referenced in the study only to note its exclusion from the survey that sought to identify suitable tasks.
Yet the agricultural applications for this aspect of AI are far-ranging. From autonomous tractors to weed recognition in the field, there is much that could be done to automate farming operations, if only we could develop computers that can actually replicate human thought.
Prohibitive cost
However, despite the omission of farming, the study revealed some sobering facts about AI in industry at large: it is not the panacea it is widely considered to be, and its expense will severely slow its introduction.
The major problem is teaching it what to do. Machine learning requires huge amounts of data, and the more data, the more accurate and useful AI becomes; but acquiring that information is costly, and processing it consumes a good deal of energy, a facet of computing that is now attracting critical attention.
The predictive model developed by the researchers was designed to forecast the cost/benefit ratio of developing and deploying AI systems to replace humans in the workplace; naturally enough, the lower the pay scales, the less attractive AI is to companies.
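To make that logic concrete, the sketch below is a minimal, purely illustrative calculation, not the researchers' actual model; all the figures in it are invented assumptions. It simply compares the annual cost of building and running an AI system against the wages that system would displace.

# Illustrative cost/benefit test for automating one task with AI.
# All figures are invented for the example; the MIT model works from
# far more detailed inputs (task-level wage data, the accuracy a
# vision system must reach, and so on).

def ai_is_cost_effective(development_cost, annual_running_cost,
                         annual_wage_bill, amortisation_years=5):
    """Return True if automating the task is cheaper than paying wages."""
    annual_ai_cost = development_cost / amortisation_years + annual_running_cost
    return annual_ai_cost < annual_wage_bill

# A low-paid task: the wages saved never cover the development bill.
print(ai_is_cost_effective(500_000, 20_000, 60_000))   # False

# A much larger wage bill for the same task tips the balance.
print(ai_is_cost_effective(500_000, 20_000, 250_000))  # True

Run with these assumed figures, the test captures the study's central point: where the wage bill is low, the fixed cost of developing the system swamps any saving.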
It was found that within American industry only 8% of menial jobs would be cost-effective to replace with AI, a far cry from the general belief that we are all going to lose our jobs to it tomorrow.
As a very basic conclusion to the study, it would be fair to say that wholesale displacement of jobs by AI is still a long way off because of the sheer cost involved in training it to do the job, a deduction supported by the struggle companies are having in making cars fully autonomous.
Another takeaway from the study is that lower-paid jobs often require a greater investment in AI to replace: driving trucks and tractors are hugely complex tasks for computers to take on, while retail and healthcare jobs appear more susceptible to digitalisation.
Another path forward
It might then be asked whether AI has any role in industry and agriculture at all. This question is being addressed by others, and the general consensus emerging is that if AI is to be deployed, it will be in supporting humans in their tasks rather than replacing them.
The Nobel laureate Daron Acemoglu has been studying the impact of AI for several years and has come to believe that the AI crusade is heading down the wrong path.
Over the centuries, machines have always been considered a means of replacing workers, and this attitude needs to change, he suggests, if we are to make the most of what AI can now do.
AI, he said, is most likely to be applied to a bounded set of white-collar tasks, where large amounts of computational power can process a lot of inputs faster than humans can, which takes us straight back to the basic strength of computers – they are very good at big sums, but still cannot think for themselves.
Where AI is being deployed today is in trying to copy what humans do, but without the spark of inspiration or invention.
Generative AI may be very good at collecting and distilling large quantities of data, but it cannot decide whether its product has any value at all; that remains subjective, and therefore a human judgement.
This may seem a long way from everyday agriculture, but until AI can gain ground in industries better able to afford it, farming on the whole will not be greatly affected by it for many years yet.