Whether comparing measurement data with Solargis datasets, performing quality control or analysing trends to extract bankable insights, Solargis’ data team works with huge volumes of data from a variety of sources daily. We spoke with Jozef Dudzak, Data Scientist, and Marketa Jansova PhD, Data Applications Team Leader, about the role of a Solar Data Specialist – and how the Solargis Analyst software transforms complex resource analysis in the solar sector.
Jozef: We receive raw solar measurements as CSV files from our customers, which need to be imported and harmonised for further analysis. This means aligning the data records with the metadata, resolving issues such as incorrect timestamps, and ensuring that the formatting is consistent. We then carry out quality control to prepare the received data for analysis.
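For illustration, a minimal sketch of this kind of harmonisation step might look like the following. It uses Python with pandas; the file name, column names and the assumed two-hour timestamp offset are hypothetical, not part of the Solargis workflow itself.

```python
# Minimal sketch of one harmonisation step (hypothetical file and column names):
# parse a raw measurement CSV, correct a known timestamp offset, and align
# records to a regular 10-minute grid so gaps become explicit.
import pandas as pd

df = pd.read_csv("station_raw.csv", parse_dates=["timestamp"])
df = df.set_index("timestamp").sort_index()

# Example issue: the logger reported local time (UTC+2) instead of UTC,
# so shift the index back by two hours.
df.index = df.index - pd.Timedelta(hours=2)

# Re-index onto a regular 10-minute grid; missing intervals show up as NaN
# and can be flagged during the quality-control step.
regular = df.asfreq("10min")
print(regular["ghi"].isna().sum(), "missing 10-minute records")
```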
Good quality solar data is the cornerstone of trusted solar resource analysis. Customers understand this, but while the industry is taking steps to improve measurement data quality, there is still a lack of consistency in the approaches taken. Part of our role is to verify the measured data thoroughly, ensuring that errors are identified and only valid data records are used in downstream analysis.
Once we have quality-controlled data, we analyse it and extract insights for customers. For project developers, our typical task is to use local ground measurements to adapt the Solargis model so that site-specific time series can be computed with reduced uncertainty. Our quality-assessed and harmonised solar measurements are also regularly used in the performance analysis of PV projects.
Jozef: We see that the most widely used data management and analysis tool is still Microsoft Excel. While it is an excellent tool for many data analysis tasks, it has limitations when processing large volumes of time series data that contain inconsistencies. Spreadsheet software is not designed for the harmonisation, quality control and sophisticated analysis of solar resource time series data. When receiving measurement data, we often see inconsistent tables, shifted timestamps and records that are out of sync. A typical spreadsheet can only hold around 1 million records – which, depending on the timestep, might not be enough for 2 years, let alone 25 years, of historical solar resource and meteorological time series data!
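A quick back-of-the-envelope calculation shows how fast that row limit is reached. The short, purely illustrative Python sketch below counts the records produced by common timesteps; two years of 1-minute data already exceeds the roughly 1,048,576-row spreadsheet limit.

```python
# Rough record counts for solar time series at common timesteps,
# illustrating why a ~1,048,576-row spreadsheet limit is quickly exhausted.
PERIODS = {"2 years": 2, "25 years": 25}
STEPS_MIN = [1, 10, 60]  # 1-minute, 10-minute and hourly data

for label, years in PERIODS.items():
    for step in STEPS_MIN:
        records = int(years * 365.25 * 24 * 60 / step)
        print(f"{label} at {step}-min step: {records:,} records")

# 2 years of 1-minute data is about 1.05 million records,
# already above the spreadsheet row limit.
```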
The next level of software sophistication is using custom scripts written in coding languages such as Python. This requires both coding expertise and solar expertise, so it is not an accessible route for many businesses in the sector. Additionally, each element of data handling is controlled by scripts, making complex analysis and comparison with multiple datasets time-consuming because the process is not interactive – a script needs to be coded, run and refined.
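To give a sense of what that scripted workflow looks like, the sketch below compares a ground-measured series against a modelled one and computes bias and RMSD. It is an illustrative example only; the file names and columns are hypothetical, and each new comparison would need its own code to be written, run and refined.

```python
# Illustrative sketch of a scripted dataset comparison (hypothetical
# file and column names): join measured and modelled GHI on timestamp,
# then compute simple agreement statistics.
import pandas as pd

measured = pd.read_csv("ground_ghi.csv", parse_dates=["timestamp"], index_col="timestamp")
model = pd.read_csv("model_ghi.csv", parse_dates=["timestamp"], index_col="timestamp")

merged = measured.join(model, lsuffix="_meas", rsuffix="_model").dropna()
bias = (merged["ghi_model"] - merged["ghi_meas"]).mean()
rmsd = ((merged["ghi_model"] - merged["ghi_meas"]) ** 2).mean() ** 0.5
print(f"bias: {bias:.1f} W/m2, RMSD: {rmsd:.1f} W/m2")
```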
There are basic quality control services available, but they have limited functionality when it comes to delivering reliable and trustworthy quality-controlled data sets. Some solar experts use alternative software from other industries for basic management and visualization, yet struggle when advanced analysis is needed.
Marketa: Through our experience handling solar data from around the world, we knew that the key step was bringing different functionalities together under one roof. That means advanced quality control, intuitive visualisation, and streamlined data management, all in one place.
We set out to deliver a software platform built for solar analysts, by solar analysts, building our domain expertise into the fabric of the solution. We have already been able to automate labour-intensive tasks such as data harmonisation, quality control and much more besides.
The software streamlines the process of comparing multiple datasets and performing calculations, enabling analysts to work with complex, diverse datasets without needing to spend time creating new scripts.
Jozef: The visualisation tools add further flexibility and save time – several datasets can easily be compared side by side. There are many graphical tools to display the differences in various ways (heat maps, scatter plots, time series plots and so on).
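As a rough illustration of one of the plot types mentioned, the sketch below builds a month-by-hour heat map of average measured irradiance with matplotlib. The file and column names are hypothetical, and this is a generic example rather than a view taken from Solargis Analyst itself.

```python
# Illustrative sketch (hypothetical file/column names): a month-by-hour
# heat map of mean measured GHI, one of the plot types used to spot
# patterns and inconsistencies in solar time series.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("ground_ghi.csv", parse_dates=["timestamp"], index_col="timestamp")
pivot = df["ghi"].groupby([df.index.month, df.index.hour]).mean().unstack()

plt.imshow(pivot, aspect="auto", origin="lower")
plt.colorbar(label="Mean GHI [W/m2]")
plt.xlabel("Hour of day")
plt.ylabel("Month")
plt.show()
```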
Solargis Analyst also allows analysts to pull together reports and summaries automatically, which means less time spent on administrative tasks – and more time spent delivering value to clients and stakeholders.
Marketa: We felt it was important to deliver a multi-level experience for users with different skills and needs. The platform provides functions and hints about what to look for, but if an analyst wants to calculate their own variable or data transformation, then Solargis Analyst offers the versatility to do this.
Jozef: Solargis Analyst can handle any type of data in Excel or CSV formats as long as it has a timestamp. The platform is optimised for a range of parameters, such as solar irradiation/irradiance, temperature, wind, precipitation, and other meteorological and atmospheric data.
The only limit to how much data you can use is a computer’s RAM. The largest dataset I have used to date contained 9 million records covering 25 years of data.
Marketa: The main goal is to expand existing functionality, while creating new functionality for data from a wider range of instrumentation and mounting types. We have also received positive feedback about the automation provided by the platform, so we will be looking to automate additional elements of the software to further simplify analysis and improve productivity.
Solar assets are becoming more complex. More data types are being used to sharpen asset owners' understanding of their project sites, from solar resource and temperature to albedo and precipitation. Solargis Analyst helps solar data analysts compare datasets easily and extract insights faster. To find out more about the new Solargis Analyst platform and explore how it can boost productivity for your solar data team, get in touch at https://solargis.com/about-us/contact