Thursday, November 30, 2006

Why spam is still a problem

The European Commission has published a report on the growth of spam, showing that it accounts for between 50 and 80 per cent of all e-mails, at a worldwide cost of €39 billion (£26 billion). According to its latest press release:

The new Communication on Spam acknowledges that legislative tools to fight these threats already exist, in particular the EU-wide “ban on spam” adopted in 2002 as part of the ePrivacy Directive (see IP/03/1015). However, implementation is still a problem in most EU Member States. To improve, they should now lay down clear lines of responsibility to use the tools available under EU law effectively. Because of the criminal trend in spam and its cross-border aspects, good cooperation between enforcement authorities is paramount. In the Commission's view, spam fighters should have sufficient resources. The Dutch fall in spam was achieved through prosecutions by spam fighter OPTA, with just 5 full-time employees and €570,000 invested in equipment.

Although we have the Directive on Privacy and Electronic Communications (2002/58/EC), more still needs to be done to tackle this problem.

Saturday, November 04, 2006

Surveillance Society

Whilst there has been much debate about living in a surveillance society (particularly as highlighted at the 28th International Conference of Data Protection and Privacy Commissioners), and about the strategies that could be adopted in regulating such surveillance, one of the interesting aspects arising from the report (pdf) commissioned by the Information Commissioner is the idea of a privacy impact assessment (PIA), and even a surveillance impact assessment. There is quite a lot to digest in this report, but here is an excerpt on the PIA, which describes it as:

• ‘an assessment of any actual or potential effects that an activity or proposal may have on individual privacy and the ways in which any adverse effects may be mitigated’;
• ‘a process. The fact of going through this process and examining the options will bring forth a host of alternatives which may not otherwise have been considered’;
• an approach and a philosophy that holds promise by instilling a more effective culture of understanding and practice within organisations that process personal data;
• a form of risk-assessment, which therefore cannot escape the uncertainties of identifying and estimating the severity and likelihood of the various risks that may appear to privacy, life-chances, discrimination, equality and so on;
• a tool for opening up the proposed technologies or applications to in-depth scrutiny, debate and precautionary action within the organisation(s) involved;
• like PETs, premised on the view that it is better to build safeguards in than to bolt them on;
• an early-warning technique for decision-makers and operators of systems that process personal information, enabling them to understand and resolve conflicts between their aims and practices, and the required protection of privacy or the control of surveillance;
• ideally, a public document, through which gains in transparency and in the elevation of public awareness of surveillance issues and dangers may be realised; in turn, it may assist regulatory bodies in carrying out their work effectively.

A further point noted in the report is that a PIA is not a compliance audit:

PIA should not be confused with compliance audits and the like, which are usually ex post facto and legally-oriented; as with environmental impact assessment, PIA assesses the likely impact of technology applications or new systems in the future, and considers a wider range of criteria.

For further reading, see pages 89 onwards of the report. Some countries, such as Canada and Australia, already use PIAs, but it remains to be seen whether they will be adopted in the UK.