May 17, 2020

DiamondCorp and the Lace Diamond Mine

Dale Benton

The emerging diamond producer DiamondCorp, which owns and operates its flagship Lace Diamond Mine in South Africa, has announced that it will reopen the mine following the suspension of production two weeks ago.

Operations at the mine were temporarily suspended after heavy rains flooded the open pit, but the company has announced that the mine’s pumping system is now operating normally.

The Lace Diamond Mine:

The Lace Diamond Mine is DiamondCorp’s primary asset in South Africa. The company was awarded a mining right to the Lace property in February 2009.

Around 4.5 million tonnes of rock were extracted and around 750,000 carats of diamonds recovered between 1901 and 1931, the first period of mining at the site.

Historically, Lace was known for its fancy pink and purple diamonds, which became signature gems of the mine, as well as for white diamonds of above-average quality.

More than 80 percent of the diamonds recovered at Lace Mine have been of gem quality.

The current treatment plant at the mine has a capacity of 1.2 million tonnes per annum (mtpa).

Delineated resources will sustain the Lace mine for at least 25 years, although the diamond pipe extends at depth and remains to be fully delineated. Production is expected to exceed 500,000 carats per annum at times during that period.

DiamondCorp:

DiamondCorp plc is focused on the re-development of the Lace Mine located near Kroonstad, South Africa. This historic mine is famed for its pink and purple fancies.

The company is currently developing the Lace underground mine to access the kimberlite pipe.

Who’s who of DiamondCorp:

Chris Ellis

Independent Interim Non-Executive Chairman

Chris Ellis was born in 1963 and has a career in investment banking spanning over 20 years in London and the United States. For the last 9 years his principal activities have revolved around all financial aspects of the diamond supply chain, representing and advising industry banks, cutters & polishers, jewellery manufacturers and retailers. He is presently a Director of DFin Ltd, and was previously a principal of Consensus Advisors in London and the United States.

Paul Loudon

Chief Executive Officer

Paul Loudon was born in 1962 and has more than 20 years’ experience in stockbroking, corporate finance and the management of junior mining and exploration companies. He has been President of Battlefields Minerals Corporation of Toronto and Chairman of BDI Mining Corp. He was head of Equities at Loeb Aron & Company Ltd, a London corporate finance house specialising in the mining sector, where he was responsible for raising considerable sums of equity capital for resource companies listed in the United Kingdom, Canada and Australia. Mr Loudon is an Associate of the Securities Institute and a member of the Southern African Institute of Mining and Metallurgy.

Jonathan Willis-Richards

Non-executive Director

Jonathan Willis-Richards was born in 1955 and holds a bachelor’s degree in Geology from Oxford University and a master’s degree in Mining Geology and a PhD from Camborne School of Mines. He is internationally known for his PhD work on numerical modelling of fractured geothermal and mineralised systems, and is a founder and Managing Director of the London mining corporate finance house Loeb Aron & Company Ltd.

Michael Toxvaerd

Non-executive Director

Michael Toxvaerd was born in 1974 and holds an MBA from Cranfield University. Currently CIO of HBG Management Partners Limited, he has significant experience in capital markets, mergers and acquisitions, and in founding, financing and listing companies on the London Stock Exchange. He is currently a non-executive director of European Islamic Bank plc and holds a number of other directorships.

Neil McDougall

Non-executive Director

Neil McDougall joins the board with many years of senior management experience in international environments.  Prior to joining Rasmala as Chief Financial Officer, Neil served in senior finance roles in various organisations including Europe Arab Bank Plc, Commerzbank and Standard Chartered. In his previous roles, he has been responsible for all aspects of Global Finance support, from performance management and decision making through to Product Control, Valuations Control and Financial and Regulatory Reporting. Mr McDougall's appointment is pursuant to the terms of the financing facility ("Facility") with Rasmala plc ("Rasmala") as announced on 20 October 2016.

 


May 16, 2021

McKinsey: adopting a smart approach to big data

Industrial companies are using AI to improve plant operations. To be successful, they will need to leverage big data with the help of domain experts

Industrial companies are embracing artificial intelligence (AI) as part of the fourth industrial revolution. AI leverages big data, promising new insights from applying machine learning to datasets with more variables, longer timescales, and higher granularity than ever before.

Big Data

According to a new McKinsey report, using months or even years’ worth of information, analytics models can tease out efficient operating regimes based on controllable variables, such as pump speed, or disturbance variables, such as weather. These insights can be embedded into existing control systems, bundled into a separate advisory tool, or used for performance management.
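As a hedged illustration of that idea (not drawn from the report itself), the sketch below fits a simple model on synthetic data relating a controllable variable (pump speed) and a disturbance variable (ambient temperature) to a plant-efficiency metric, then scans candidate pump speeds to suggest an efficient operating point, the kind of output that could feed an advisory tool. All tags, values and relationships are hypothetical.

```python
# Illustrative sketch: relate a controllable variable (pump speed) and a
# disturbance variable (ambient temperature) to a plant-efficiency metric,
# then scan pump speeds to suggest an efficient operating point.
# All data and variable names are synthetic/hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 2_000  # e.g. a long history of operating snapshots

pump_speed = rng.uniform(600, 1_200, n)        # controllable (rpm)
ambient_temp = rng.uniform(-5, 35, n)          # disturbance (deg C)
# Synthetic efficiency: peaks near 950 rpm, degrades on hot days, plus noise.
efficiency = (
    0.9
    - 1e-6 * (pump_speed - 950) ** 2
    - 0.002 * np.clip(ambient_temp - 25, 0, None)
    + rng.normal(0, 0.01, n)
)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(np.column_stack([pump_speed, ambient_temp]), efficiency)

# Advisory use: for today's forecast temperature, which pump speed looks best?
candidate_speeds = np.linspace(600, 1_200, 61)
today_temp = 30.0
preds = model.predict(np.column_stack([candidate_speeds,
                                       np.full_like(candidate_speeds, today_temp)]))
print(f"Suggested pump speed at {today_temp} C: {candidate_speeds[preds.argmax()]:.0f} rpm")
```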

Artificial Intelligence

McKinsey recommends that to succeed with AI, companies should leverage historical data via automation. They will need to adapt their big data into a form amenable to AI. This ‘smart data’ can improve predictive accuracy and support root-cause analysis. Additionally, bolstering and upskilling expert staff to manage the process can result in an EBITDA increase of 5 to 15%.

Smart Data

A common failure mode for companies looking to leverage AI is poor integration of operational expertise into the data-science process. McKinsey advocates applying machine learning only after process data have been analysed, enriched, and transformed with expert-driven data engineering using the following steps:

Define the process

Outline the steps of the process with experts and plant engineers, sketching out physical changes (such as grinding and heating) and chemical changes (such as oxidation and polymerization). Identify critical sensors and instruments, along with their maintenance dates, limits, units of measure, and whether they can be controlled.
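One lightweight way to make that expert-defined process map machine-readable is a small sensor registry along the lines of the sketch below. The schema, tags and values are illustrative assumptions, not a format prescribed by McKinsey.

```python
# Hypothetical schema for capturing the expert-defined process map: each step
# records its physical/chemical change and the critical sensors with their
# limits, units, maintenance dates and controllability.
from dataclasses import dataclass, field

@dataclass
class Sensor:
    tag: str                 # historian tag, e.g. "FIC-101.PV" (made up)
    unit: str                # engineering unit
    low_limit: float         # physically plausible range
    high_limit: float
    controllable: bool       # can operators set it, or is it a disturbance?
    last_maintenance: str    # ISO date of last calibration/maintenance

@dataclass
class ProcessStep:
    name: str
    change: str              # physical or chemical change, e.g. "grinding"
    sensors: list[Sensor] = field(default_factory=list)

milling = ProcessStep(
    name="primary mill",
    change="grinding (physical)",
    sensors=[
        Sensor("WIC-001.PV", "t/h", 0, 400, controllable=True, last_maintenance="2021-03-01"),
        Sensor("TI-014.PV", "degC", -10, 90, controllable=False, last_maintenance="2021-01-15"),
    ],
)
print([s.tag for s in milling.sensors if s.controllable])
```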

Enrich the data

Raw process data nearly always contain deficiencies. Thus, creating a high-quality dataset should be the focus, rather than striving for the maximum number of observables for training. Teams should be aggressive in removing nonsteady-state information, such as the ramping up and down of equipment, along with data from unrelated plant configurations or operating regimes. Generic methods to treat missing or anomalous data should be avoided, such as imputing using averages, “clipping” to a maximum, or fitting to an assumed normal distribution. Instead, teams should start with the critical sensors identified by process experts and carefully address data gaps using virtual sensors and physically correct imputations.
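A minimal sketch of that enrichment step, on synthetic data and with assumed tags and thresholds: drop non-steady-state rows rather than imputing through them, and fill a broken density signal with a physically derived virtual sensor (mass flow divided by volumetric flow) instead of a column average.

```python
# Hedged sketch of the "enrich" step on synthetic data: keep only steady-state
# operation and use a physically correct imputation for a missing signal.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
idx = pd.date_range("2021-01-01", periods=500, freq="min")
df = pd.DataFrame({
    "feed_rate_tph": np.r_[np.linspace(0, 300, 100),          # ramp-up: not steady state
                           300 + rng.normal(0, 3, 400)],
    "mass_flow_kgs": 80 + rng.normal(0, 1, 500),
    "vol_flow_m3s": 0.05 + rng.normal(0, 0.0005, 500),
    "density_kgm3": np.nan,                                    # broken sensor
}, index=idx)

# 1) Keep only steady-state operation: low rolling variability on the feed rate.
steady = df["feed_rate_tph"].rolling("30min").std() < 10
df = df[steady]

# 2) Physically correct imputation: density = mass flow / volumetric flow,
#    instead of clipping or filling with the column mean.
df["density_kgm3"] = df["density_kgm3"].fillna(df["mass_flow_kgs"] / df["vol_flow_m3s"])

print(f"{len(df)} steady-state rows retained; density gaps filled via virtual sensor")
```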

Reduce the dimensionality

AI algorithms build a model by matching outputs, known as observables, to a set of inputs, known as features, which consist of raw sensor data or derivations thereof. Generally, the number of observables must greatly exceed the number of features to yield a generalized model. A common data-science approach is to engineer input combinations to produce new features. When combined with the sheer number of sensors available in modern plants, this necessitates a massive number of observations. Instead, teams should pare the features list to include only those inputs that describe the physical process, then apply deterministic equations to create features that intelligently combine sensor information (such as combining mass flow and volumetric flow to yield density). Often, this is an excellent way to reduce the dimensionality of the data and introduce relationships into it, minimizing the number of observables required to adequately train a model.
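By way of a hedged example (hypothetical tags and relations), the snippet below keeps only the physically relevant inputs and adds one engineered feature from a deterministic relation, rather than generating every pairwise sensor combination.

```python
# Sketch of expert-driven dimensionality reduction: retain physically relevant
# inputs and add a deterministic, physics-based combination of two sensors.
import pandas as pd

def engineer_features(raw: pd.DataFrame) -> pd.DataFrame:
    """Return the reduced, physics-informed feature set."""
    features = raw[["pump_speed_rpm", "ambient_temp_c"]].copy()   # relevant inputs only
    # Deterministic combination of two sensors into one feature.
    features["slurry_density_kgm3"] = raw["mass_flow_kgs"] / raw["vol_flow_m3s"]
    return features

raw = pd.DataFrame({
    "pump_speed_rpm": [950, 980],
    "ambient_temp_c": [22.0, 31.0],
    "mass_flow_kgs": [80.0, 82.0],
    "vol_flow_m3s": [0.050, 0.051],
    "unrelated_hvac_kw": [12.0, 12.5],    # dropped: unrelated to this process step
})
print(engineer_features(raw))
```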

Apply machine learning

Industrial processes can be characterized by deterministic and stochastic components. In practice, first principle–based features should provide the deterministic portion, with machine-learning models capturing the statistical portion from ancillary sensors and data. Teams should evaluate features by inspecting their importance and therefore their explanatory power. Ideally, expert-engineered features that capture, for example, the physics of the process should rank among the most important. Overall, the focus should be on creating models that drive plant improvement, as opposed to tuning a model to achieve the highest predictive accuracy. Teams should bear in mind that process data naturally exhibit high correlations. In some cases, model performance can appear excellent, but it is more important to isolate the causal components and controllable variables than to solely rely on correlations. Finally, errors in the underlying sensor data should be evaluated with respect to the objective function. It is not uncommon for data scientists to strive for higher model accuracy only to find that it is limited by sensor accuracy.
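The sketch below illustrates that division of labour on synthetic data: a physics-based engineered feature carries most of the signal, a gradient-boosting model picks up the rest, and the feature importances are inspected to confirm that the expert-engineered feature ranks highest. The setup is an assumption for illustration, not McKinsey's own workflow.

```python
# Hybrid idea on synthetic data: a first-principles feature explains the
# deterministic part; the ML model mops up the rest. Inspect importances to
# check the physics-based feature dominates.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(2)
n = 1_000
density = 1_400 + rng.normal(0, 40, n)          # engineered, physics-based feature
pump_speed = rng.uniform(600, 1_200, n)
noise_sensor = rng.normal(0, 1, n)              # ancillary, weakly informative sensor

# Synthetic target: mostly deterministic in density, small effect of pump speed.
throughput = 0.2 * density + 0.01 * pump_speed + rng.normal(0, 2, n)

X = np.column_stack([density, pump_speed, noise_sensor])
model = GradientBoostingRegressor(random_state=0).fit(X, throughput)

for name, imp in zip(["density", "pump_speed", "noise_sensor"], model.feature_importances_):
    print(f"{name:12s} importance = {imp:.2f}")   # expect density to dominate
```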

Implement and validate the models

Impact can be achieved only if models (or their findings) are implemented. Taking action is critical. Teams should continuously review model results with experts by examining important features to ensure they match the physical process, reviewing partial dependence plots (PDPs) to understand causality, and confirming what can actually be controlled. Additional meetings should be set up with operations colleagues to gauge what can be implemented and to agree on baseline performance. It is not uncommon for teams to convey model results in real time to operators in a control room or to engage in on-off testing before investing in production-grade, automated solutions.
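As a hedged, self-contained sketch of that review step (all data and names are synthetic), the code below computes a hand-rolled partial-dependence curve for a controllable variable so that experts can sanity-check the model’s implied causality before anything reaches the control room.

```python
# Hand-rolled partial dependence on synthetic data: sweep the controllable
# variable over a grid, hold other features at their observed values, and
# average the model's predictions at each grid point.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def partial_dependence_1d(model, X, feature_idx, grid):
    """Average prediction as one feature sweeps `grid`, others held at observed values."""
    out = []
    for value in grid:
        X_mod = X.copy()
        X_mod[:, feature_idx] = value
        out.append(model.predict(X_mod).mean())
    return np.asarray(out)

rng = np.random.default_rng(3)
n = 1_000
pump_speed = rng.uniform(600, 1_200, n)       # controllable
ambient_temp = rng.uniform(-5, 35, n)         # disturbance
throughput = 200 - 1e-4 * (pump_speed - 950) ** 2 - 0.5 * ambient_temp + rng.normal(0, 2, n)

X = np.column_stack([pump_speed, ambient_temp])
model = GradientBoostingRegressor(random_state=0).fit(X, throughput)

grid = np.linspace(600, 1_200, 7)
for speed, resp in zip(grid, partial_dependence_1d(model, X, feature_idx=0, grid=grid)):
    print(f"pump_speed={speed:6.0f} rpm -> expected throughput {resp:6.1f}")
# Experts then confirm this curve matches process physics and that pump speed
# can actually be moved before the advisory goes to operators.
```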

Conclusion

Industrial companies are looking to AI to boost their plant operations—to reduce downtime, proactively schedule maintenance, improve product quality, and so on. However, achieving operational impact from AI is not easy. To be successful, these companies will need to engineer their big data to include knowledge of the operations (such as mass-balance or thermodynamic relationships). They will also need to form cross-functional data-science teams that include employees who are capable of bridging the gap between machine-learning approaches and process knowledge. Once these elements are combined with an agile way of working that advocates iterative improvement and a bias to implement findings, a true transformation can be achieved.
