DataCloud bridging the mining industry’s data divide

DataCloud is looking to collect and merge the mining industry’s datasets through a cleaning, processing, integration, and predictive analytics platform that can help different stages of an operation prepare and plan for the ore and waste heading their way.

While the coarse ore stockpile may be the section of the flowsheet currently in DataCloud’s crosshairs – thanks to a well-attended webinar a few months back – any part of the mining process that is “between departments” could benefit from the MinePortal solution, according to Steven Putt, Director of Software Solutions for the company.

“The value case is inherent anywhere between departments – i.e. the stockpile is after crushing, but before the mill,” he told IM.

“The reason that stockpile is there – it tends to only be half a day or a day’s material – is it is a buffer for the mill,” Putt said. “Within this pile, one truck might have been hauling very hard material that the mill is exclusively treating for a week or so. Then, in accordance with the mine plan, this can switch to another truck and a new area of the mine, meaning the mill is going to have to adapt to a completely different material.”

The distinction between material in the coarse ore stockpile is often not this apparent; it tends to represent the mine site’s ‘melting pot’, taking in material from all over the operation.

Yet, to operate effectively, the mill needs to know the origins of the material coming its way ahead of time. The mill would then, ideally, be re-configured to treat the material.

“The mill operator would need to change the speeds of operation, the water balance, potentially the grinding media, etc,” Putt said. “Operators would typically prefer not to make those changes though, having the mill running at some ‘optimal’ speed based on the idea that the material is relatively consistent.”

The reality of the situation is different, as DataCloud and its MinePortal platform have been proving.

“The last client we worked with could end up saving around $20 million a year by carrying out our recommended processes as part of a wider mine to mill tracking solution,” Putt said of a copper-gold operation the company worked at. “Basically a specific rock type (skarn) was being fed into the mix too often and the mill was not prepared to handle this in the blend.”

This client turned out to be spending more money than necessary on its blasting process – using too much energy blasting the material to create a ‘uniform’ blend. But, in upping the amount of explosive used, it created sub-optimal crusher feed.

This saw a primary crusher designed to treat material around 5 in (127 mm) in size being fed material averaging around 1 in (25 mm), according to Putt.

The primary crushing process was ineffective to say the least.

By adapting the blasting process to target the primary crusher’s design feed size, reorienting the mine plan so less skarn material was fed into the coarse ore stockpile at once, and adding steel ball media to the mill to deal with the skarn that was fed into it, the headline savings were made, according to Putt.

Such savings come with quite a bit of due diligence work, he explains.

“It is not just about connecting disparate datasets; a tremendous amount of work goes into cleaning and contextualising the data – knowing which information is right for the project at hand and which data is not applicable,” Putt said of the MinePortal data gathering and analysis procedure.

Where other data-focused companies can clean datasets and put them into algorithms to form various predictions, DataCloud’s mining knowledge and deep collaboration with customers enable the company to create fit-for-purpose solutions that work in a practical sense on the mine site.

This process requires at least six months of relevant data up front. Then, a four-week deep dive into this data is needed to find out whether the existing dataset can solve production bottleneck issues. The US-based company then normally allocates another three months to kick off the solution, onboard all teams and see improvements come through, according to Putt.

“I wouldn’t say it is a complete customisation, but there does tend to be differences in place at every mine site we visit that means the MinePortal solutions are somewhat unique,” Putt said.

Coming back to the coarse ore stockpile example, Putt recommends hard-rock miners add another filter to their existing blending process to help improve results.

“It is about adding a mill risk factor to an existing grade control program; getting the engineers to plan the mining regime in a certain way to effectively prepare the mill for the material being fed into the coarse ore stockpile,” he said.

Miners can do this by obtaining a good idea of the time window in which the material delivered to the stockpile is entering the mill, enabling engineers to trace it back into the pit and analyse the properties that were observed – and captured – during the drill and blast process.

“This can be a tricky thing to do as the size of the stockpile is changing so often,” Putt says.
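In its simplest form, tracing a time window back into the pit can be pictured as a first-in-first-out model of the stockpile: truck loads are queued with their source block, and mill draw is matched against that queue. The sketch below is an illustration of the idea only – real coarse ore stockpiles blend and draw unevenly, and the block names and tonnages are invented:

```python
# Hedged sketch: estimate which pit blocks are feeding the mill, assuming a
# strictly first-in-first-out (FIFO) stockpile. A simplification only --
# real stockpiles mix material, which is why the changing pile size makes
# this tricky in practice.

from collections import deque

def trace_mill_feed(truck_loads, mill_draw_t):
    """truck_loads: list of (source_block, tonnes) in dump order.
    mill_draw_t: tonnes drawn by the mill. Returns tonnes fed per block."""
    pile = deque(truck_loads)
    fed = {}
    remaining = mill_draw_t
    while remaining > 0 and pile:
        block, tonnes = pile.popleft()
        take = min(tonnes, remaining)
        fed[block] = fed.get(block, 0) + take
        remaining -= take
        if tonnes > take:  # part of the load is still sitting on the pile
            pile.appendleft((block, tonnes - take))
    return fed

loads = [("bench_12_ox", 150), ("bench_07_skarn", 150), ("bench_12_ox", 150)]
print(trace_mill_feed(loads, 200))
# prints {'bench_12_ox': 150, 'bench_07_skarn': 50}
```

A real reconciliation would replace the FIFO assumption with learned residence-time patterns, which is where the machine-learning approach described below comes in.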

Some miners use RFID tags embedded in truck loads to get a rough idea on a weekly or monthly basis when the delivered material is finding its way into the mill, but few do this on a consistent basis.

MinePortal uses machine-learning algorithms the company has augmented for geology and mining needs to automate the process.

Using features such as dynamic time warping – which measures the similarity between two temporal sequences that may vary in speed – the platform is able to reconcile timing differences from ore being dumped into a primary crusher, to sitting in a stockpile, to passing through the rest of the mill.

Putt expands on this: “There is enough robust data within a mill’s database to run dynamic time warping, a machine-learning method, to compute the delays (of the material coming into the mill) as they change.

“We don’t need the timing of the delay to be consistent; we need the data to be recorded consistently so we can find the patterns of the delays from stage to stage. Running the data through machine learning will learn the rhythms of the stockpile and filter out inconsistencies.”
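To illustrate the underlying alignment technique – not MinePortal’s actual implementation – a minimal dynamic time warping distance can be written in a few lines. Two plant signals with the same shape but a drifting lag still align closely under DTW, where a pointwise comparison would not:

```python
# Minimal dynamic time warping (DTW) sketch. Aligns two time series (e.g. a
# crusher feed signal and a mill feed signal) even when the delay between
# them stretches and shrinks. Textbook O(n*m) formulation, for illustration.

def dtw_distance(a, b):
    """Cumulative DTW cost between sequences a and b (absolute difference)."""
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = best cost of aligning a[:i] with b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # a[i-1] repeats
                                 cost[i][j - 1],      # b[j-1] repeats
                                 cost[i - 1][j - 1])  # one-to-one match
    return cost[n][m]

crusher = [1, 2, 4, 8, 4, 2, 1, 1]
mill    = [1, 1, 2, 4, 8, 8, 4, 2, 1]  # same shape, shifted and stretched
print(dtw_distance(crusher, mill))     # prints 0.0 -- a perfect alignment
```

The zero distance shows why consistent recording matters more than a consistent delay: the warping path itself recovers how the lag changes from stage to stage.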

At the reconciliation stage, mining companies can pair the material signatures (rock hardness, for instance) with the results from the mill (energy draw, grind size, etc).

“Typically, we find there might be one or two specific blend types that are causing the issues,” Putt said. “From there, we can carry out real-time planning to improve the operation. We then have a feedback loop where you identify the problem feeds, change the blending over the next three months and then keep running through the process for continued improvements.”
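The mill risk factor Putt describes can be pictured as a simple tonnage-weighted score over the rock types in a planned blend, flagging feeds the mill is unlikely to handle well. The rock types, weights and threshold below are illustrative assumptions, not DataCloud parameters:

```python
# Hedged sketch of a "mill risk factor" added to a grade control program.
# Risk weights per rock type are invented for illustration; skarn is scored
# high, per the copper-gold example above.

MILL_RISK = {"oxide": 0.1, "porphyry": 0.3, "skarn": 0.9}  # assumed weights

def blend_risk(blend):
    """blend: {rock_type: tonnes}. Returns tonnage-weighted risk in [0, 1]."""
    total = sum(blend.values())
    return sum(MILL_RISK[rock] * t for rock, t in blend.items()) / total

plan = {"oxide": 600, "porphyry": 300, "skarn": 100}
risk = blend_risk(plan)
print(f"blend risk {risk:.2f}", "-> flag for mill" if risk > 0.4 else "-> OK")
# prints: blend risk 0.24 -> OK
```

A score like this is what the feedback loop would act on: re-sequence the plan until the next three months of blends sit below the mill’s risk tolerance.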

But it all comes back to ore blending.

“The best way to handle the problem is from the ore blending point of view,” Putt said. “If you can get your ore blending to be spot on where it comes with the lowest risk of impacting the mill’s performance or availability, then the mill won’t have to do anything different (change speeds, adopt new grinding media, etc).

“You still have to dig, haul and send the material to the mill, but you are sending this material to the mill in different proportions.

“It comes with the same input costs; it just requires a bit of extra planning ahead of time to save a tonne of money in the mill.”