The EIMS project coordinator, Madeline Oosthuizen, drafted this theme report because no bids were received to complete the study on this theme.
The findings include -
• Complying with the regulations: In assessing the quality of project-based tools, the officials reviewing them and the consultants preparing them may over-emphasise whether content is “present”. This may result in less emphasis being placed on what guidance the content provides for decision-making. The exercise can degenerate into a cataloguing of content, in which case critical assessment of the merits of the project and serious consideration of its impact on the receiving environment become secondary considerations.
• Link to sustainability goals: The guidelines require that “information can be linked to the broader goals and priorities of sustainable development in South Africa, and that it explains clearly how the proposed activity would add to or detract from such goals”. Without adequate baseline information, comprehensive goals, and linkages between strategic objectives and local projects, this requirement will continue to be poorly met.
• Voluminous vs comprehensive and concise: What is required is the distilling of all the relevant information, rather than the provision of copious volumes without analysis. Verbose reports add to public participation fatigue. There is a tendency to equate quality with volume, in which case EAPs may fail to give considered recommendations. There are complaints, especially from I&APs, that summaries are poorly drafted and omit information that places the proposed development in a negative light. Annexures could be better used: conclusions and recommendations can be included in the main report, with the study details provided as annexures.
• Quality standards in the regulations are not enforced: Comprehensive guidelines exist for many of the tools, and the regulations permit officials to address poor-quality reporting, yet quality control appears poorly enforced.
• Sustainable Development is not being achieved: Urgent action is required to direct the development path of the country towards sustainability – this is the concluding statement in the National Strategy and Action Plan for Sustainable Development. It conveys that sustainable development is not being achieved and that intervention is urgent. In relation to Sustainable Development, the Review found:
o EIA processes generally serve to motivate activities rather than assess whether or not activities should be permitted;
o EIA processes tend to generate mitigation measures rather than assess whether or not activities should be permitted;
o There is general ignorance among both officials and practitioners of the sustainable development purpose of EIA. Sustainable development is seldom reflected deliberately and comprehensively in EIA documents.
o The biodiversity conservation imperative set by NEMA as a cornerstone of sustainable development is also usually not adequately reflected in EIA processes, especially in how local, site-specific issues bear on the broader biodiversity context.
While EIA processes may meet the quality criteria (all the boxes ticked), they often fail to make a real contribution to the quality of the decision made in the context of the specific area or sector concerned.
• Indicators: The indicators for determining the effectiveness of instruments in achieving strategic or overarching goals are not transparent, seldom referenced and rarely, if ever, monitored at project level. There is no central repository to collect, collate and interpret the data that drives the criteria for measuring, for instance, sustainable development.
• Strategic tools are not being used effectively to inform projects: Spatial planning instruments can play a very important role in avoiding unnecessary impacts, especially at local level, and should discourage applications in areas that are unsuitable. The EMF is under-utilised as a strategic tool: it can be used to screen out applications that are inappropriate for specific geographical areas. The SEA is likewise insufficiently used: it can be used to screen projects and avoid continued expenditure – say, on EIA – where a project is inappropriate or unsustainable.
• Tools address cumulative impacts poorly: Cumulative impacts are generally not considered effectively, and there is much room for improvement in this respect. Cumulative effects assessment should be one of the basic information sources informing EIA and SEA, since the synergism between issues within a cumulative impact may produce a different outcome than assessing individual impacts alone. However, cumulative effects are hard to assess at the level of a project-specific EIA, which is a compelling argument for the increased use of strategic-level assessments.
• Professional registration of practitioners and the quality of EIAs: Independence is provided for in the regulations, and there is recourse for any I&AP who considers the EAP to have a vested interest in the outcome of an application. Objectivity requires an application to present logical, verifiable and scientific information about a project, and to make reasoned recommendations (for or against). Interference by applicants/proponents in the assessment process often undermines the independence of practitioners and prevents the objective evaluation of EIAs by officials.
• Authorisations are poorly drafted, aggravating poor enforcement: Conditions are not always appropriate or feasible, yet proponents choose not to appeal the conditions because the process is protracted. Developers implement the environmental management plan on the one hand and, on the other, try to comply with the checklist in the authorisation to avoid prosecution. Where these are contradictory, it leads to inefficiencies and makes a mockery of the legislation.
• Compliance enforcement is poor and under-resourced: there are insufficient EMIs at local-authority level.
• Ambiguity of the EMP: EMPs tend to be vague on outcomes, focusing more on defining input measures than on defining output or outcome performance. On the projects reviewed, there also appears to be little recognition of the legal status of the EMP: many developers view it as a guideline document rather than one with legally enforceable provisions.
• Weak enforcement: Competent authorities rarely, if ever, conduct inspections to ensure that the conditions of environmental authorisations are followed. There is an expectation of self-regulation once an authorisation has been issued. This expectation presumes that authorised proponents -
o keep their activities exactly as authorised;
o maintain an environmental profile equivalent to, or better than, the one used in the assessment process;
o diligently and robustly implement all the conditions of authorisation;
o report all incidents accurately and timeously; and
o effectively report their environmental performance.
Compliance monitoring and ensuring the implementation of mitigation measures are not receiving adequate attention.
• Updating information systems: The varying quality of EIA applications affects good decision-making. Scientific information provided by practitioners in applications is often inconclusive, and there are differing opinions about the validity and relevance of the information before decision-makers (whether internal or presented by the proponent).
• Administrative fines: There has also been a call for the re-instatement of Environmental Courts. The Department of Justice has opined that insufficient cases exist to warrant the courts; lobbyists have countered that insufficient cases exist because of the difficulty of moving an environmental case through the normal channels. Some have proposed administrative fines as an alternative.
• Offsets: The practice of “offsets” is contentious. For areas of unique and irreplaceable biodiversity value, offsetting is neither possible nor appropriate; in such cases, proposed development projects should either be carried out on sites of lower biodiversity value, complemented by compensation, or not carried out at all. There are three broad schools of thought on offset currencies:
o area alone (increasingly discredited);
o area and condition or quality of biodiversity (current best practice, of which many US and German currencies are variants); and
o metrics of species’ populations and persistence.
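The second school of thought can be illustrated with a small sketch. This is a hypothetical example of a condition-weighted (“habitat hectare” style) offset currency; the function names, the 0–1 condition scale and the risk multiplier are illustrative assumptions, not a prescribed methodology — real schemes apply scheme-specific scoring and ratios.

```python
# Hypothetical sketch of an "area and condition" offset currency.
# All names and values are illustrative assumptions, not a real scheme.

def condition_adjusted_area(area_ha: float, condition: float) -> float:
    """Area weighted by habitat condition, where condition is a score
    between 0 (fully degraded) and 1 (pristine benchmark)."""
    if not 0.0 <= condition <= 1.0:
        raise ValueError("condition must be between 0 and 1")
    return area_ha * condition

def required_offset_area(impact_ha: float, impact_condition: float,
                         offset_condition: float,
                         multiplier: float = 1.0) -> float:
    """Offset area needed so the condition-adjusted gain at least
    matches the condition-adjusted loss, scaled by a risk multiplier."""
    loss = condition_adjusted_area(impact_ha, impact_condition) * multiplier
    return loss / offset_condition

# Losing 10 ha of habitat in good condition (0.8), offset on land in
# moderate condition (0.5), with an assumed 2x risk multiplier:
print(required_offset_area(10, 0.8, 0.5, 2.0))  # → 32.0
```

The point of the sketch is that weighting by condition changes the answer materially: an area-only currency would require 10 ha, while the condition-weighted calculation requires 32 ha — which is why area-alone currencies are increasingly discredited.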
• Data availability and scale: There are data gaps and differing methodologies in use; access to information is often restricted; and data collection is expensive, as are storage, dissemination and maintenance.
Conclusions on the Quality of Tools
• Information requirements per tool, minimum criteria, and the completeness of information within the tool are concerns addressed by the regulations and guidelines. Generally, inadequacy can be attributed to the inaccessibility of the guidelines, or simply to not knowing that they exist. The guidelines and regulations will require amendment: they should be re-focused, made centrally available and be the subject of an awareness campaign.
• Comprehensive guidelines exist for many of the tools, and the regulations permit officials to address poor-quality reporting, yet quality control appears poorly enforced.
• Strategic planning tools appear to be moving towards an outcomes-based approach to ensuring the quality of the tool (e.g. sustainability goals), although this has not yet been fully implemented.
• However, the quality control of project-based tools still appears to focus on the content of reports rather than on outcomes.
• For an outcomes-based approach to be effective, the manner in which information is interpreted in reports will have to be revised.
• It is envisaged that certain baseline data will always be required; a pre-determined level of background information may therefore be standardised for replication at project level. (It then becomes the task of the EAP to continually ground-truth the information and add quality data, in standardised formats, to an information repository.)
• As found in the Review, the EIA is the most frequently used tool, even when other tools may be better suited, more economical or more effective.