Crime prediction software launched in Dubai

Dubai Police has developed crime prediction software. The technology, built on advanced algorithms, is based on work by Space Imaging Middle East (SIME), and the predictions it generates are extremely accurate, at least according to the developers themselves.

Launch of crime prediction software in Dubai

In Minority Report, adapted from Philip K. Dick’s story, Tom Cruise plays an officer in a “PreCrime” unit that uses predictions of future offences to stop perpetrators before they commit them. Dubai Police is now using a loosely similar technique.

The software is designed to continuously search networked police databases for similarities to crimes that have already been committed, and to try to identify where and when a new offence might occur.
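The article does not describe how that matching works internally, so purely as an illustration, the sketch below shows one way a system could rank past incidents by how closely they resemble a new report. The record fields, weights and scoring rule are assumptions made for this example, not details of the SIME software.

```python
# Illustrative sketch only: the real Dubai Police / SIME system is proprietary,
# and every field, weight and the scoring rule here is an assumption.
from dataclasses import dataclass
from math import sqrt


@dataclass
class Incident:
    lat: float      # latitude of the recorded offence
    lon: float      # longitude of the recorded offence
    hour: int       # hour of day the offence occurred (0-23)
    offence: str    # offence category, e.g. "burglary"


def similarity(a: Incident, b: Incident) -> float:
    """Crude score: same offence type, nearby location, similar time of day."""
    spatial = sqrt((a.lat - b.lat) ** 2 + (a.lon - b.lon) ** 2)
    # Wrap-around distance on the 24-hour clock, scaled to 0..1.
    temporal = min(abs(a.hour - b.hour), 24 - abs(a.hour - b.hour)) / 12.0
    same_type = 1.0 if a.offence == b.offence else 0.0
    # Arbitrary weights: matching type helps most, distance in space and time hurts.
    return same_type - 0.5 * temporal - 10.0 * spatial


def most_similar(history: list[Incident], pattern: Incident, k: int = 3) -> list[Incident]:
    """Return the k past incidents that most resemble the new report."""
    return sorted(history, key=lambda inc: similarity(inc, pattern), reverse=True)[:k]


# Hypothetical usage: rank past incidents against a fresh burglary report.
history = [
    Incident(25.20, 55.27, 2, "burglary"),
    Incident(25.21, 55.28, 3, "burglary"),
    Incident(25.10, 55.20, 14, "fraud"),
]
print(most_similar(history, Incident(25.205, 55.275, 2, "burglary"), k=2))
```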

Space Imaging Middle East (SIME) representatives believe this process can also show police authorities which areas of a city need services other than crime prevention.

“The software is genuinely intelligent. It can accurately analyse complex patterns of criminal behaviour hidden in seemingly unrelated incidents and produce a reasonable estimate of the likelihood that a crime will be committed,” says SIME’s Spandan Kar. “We believe that such analyses, combined with police expertise, are a powerful weapon against criminals.”

The University of Chicago researchers behind a separate crime-prediction model, described below, also used their data to identify ways in which people’s prejudices affect policing. They looked at arrest rates in Chicago neighbourhoods of different socioeconomic levels and found that crimes committed in wealthier areas resulted in more arrests than crimes committed in poorer areas, suggesting that police responses were biased.
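The kind of comparison described above can be illustrated with a small, hypothetical calculation: arrests per recorded crime, grouped by neighbourhood income bracket. The figures below are invented, and the actual study’s methodology is considerably more involved.

```python
# Hypothetical numbers only, illustrating the comparison described in the text:
# arrests per recorded crime, grouped by neighbourhood income bracket.
from collections import defaultdict

# (income bracket, crimes recorded, arrests made) per neighbourhood
records = [
    ("higher-income", 120, 54),
    ("higher-income", 90, 41),
    ("lower-income", 150, 38),
    ("lower-income", 200, 47),
]

totals = defaultdict(lambda: [0, 0])  # bracket -> [total crimes, total arrests]
for bracket, crimes, arrests in records:
    totals[bracket][0] += crimes
    totals[bracket][1] += arrests

for bracket, (crimes, arrests) in totals.items():
    print(f"{bracket}: {arrests / crimes:.0%} of recorded crimes led to an arrest")
```

A persistent gap between the brackets’ arrest rates is the kind of pattern the researchers read as evidence of a biased police response.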

Lawrence Sherman of the Centre for Evidence-Based Policing in Cambridge, UK, expressed concern that the study mixed reactive and proactive policing: crimes recorded because people reported them and crimes recorded because police actively pursued them. The latter type of data is particularly prone to bias. According to Sherman, it “may reflect that the police are deliberately discriminating in certain areas”.

The personal biases that humans can unintentionally transfer to AI are a fascinating topic.

The software, an artificial intelligence, teaches itself. This type of solution is already in use worldwide: artificial intelligence is already helping tax authorities stop financial crimes, and its algorithms are being used to identify ISIS recruits by analysing social networks. Over time, as the tool evolves, the range of ways it is applied will grow.

Given crime data, the AI can predict, with about 90 per cent accuracy, where crimes are likely to occur the following week. However, there are concerns that similar systems in the police force could perpetuate biases, including racial bias. The researchers behind the AI algorithm say it could also be used in reverse, to detect such biases.

Ishanu Chattopadhyay of the University of Chicago and colleagues built an AI model that analysed historical crime data from 2014 to 2016 for Chicago, Illinois. As a result, the system was able to predict crime with high accuracy in the weeks immediately following the training period. The future of law enforcement technology is a fascinating and highly controversial topic.

The model predicts the probability of a given crime rate across the entire city, divided into blocks approximately 300 metres across. The AI predicted crime a week in advance with roughly 90 per cent accuracy. The model was also trained and tested on data from seven other major US cities, and it performed comparably well in each.
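For readers curious what a per-tile forecast looks like in code, here is a deliberately simplified sketch: it bins offences into 300-metre tiles and projects next week’s count from a short moving average. The tile mapping, the toy data and the forecasting rule are assumptions for illustration only; the model in the study is far more sophisticated.

```python
# A deliberately naive stand-in for the study's model: count offences per
# 300-metre tile per week, then project next week from a short moving average.
# The tile size comes from the article; everything else is an assumption.
from collections import defaultdict

TILE_METRES = 300


def tile_of(x_m: float, y_m: float) -> tuple[int, int]:
    """Map a position (metres on a local grid) to its tile index."""
    return int(x_m // TILE_METRES), int(y_m // TILE_METRES)


# Hypothetical history: (week number, x metres, y metres) of recorded offences.
events = [(0, 120, 80), (0, 150, 90), (1, 130, 70), (1, 900, 910), (2, 140, 60)]

# Offence counts per tile per week.
weekly = defaultdict(lambda: defaultdict(int))
for week, x, y in events:
    weekly[tile_of(x, y)][week] += 1


def forecast_next_week(counts: dict, window: int = 3) -> float:
    """Average the counts of the most recent `window` observed weeks."""
    recent = sorted(counts)[-window:]
    return sum(counts[w] for w in recent) / len(recent)


for tile, counts in weekly.items():
    print(f"tile {tile}: expected offences next week ~ {forecast_next_week(counts):.1f}")
```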

The software gives law enforcement agencies a remarkable capability. Being able to anticipate murders, robberies or smuggling schemes could certainly change the way they work.