NRGPy

Overview

nrgpy is the Python package for processing NRG Data Files.

It provides tools for:

  • Converting binary ".rld" and ".rwd" files to text format
    • using locally installed SymphoniePRO Desktop Software and Symphonie Data Retriever
    • using the NRG Cloud API (compatible with Linux!) (SymphoniePRO only at this time)
  • Reading Symphonie text exports into Pandas dataframes
  • Reading Spidar zip/csv files into Pandas dataframes
  • Timestamp adjustment (of text files)
  • Simple quality checks on data

Installation

pip install nrgpy

Examples

RLD files

Convert a folder of RLD files to Text with SymphoniePRO Desktop Software

import nrgpy
date_filter = '2018-10' # filter on any text in the filenames
text_folder_name = 'text_outputs/'
converter = nrgpy.local_rld(rld_dir='', out_dir=text_folder_name, file_filter=date_filter)  # rld_dir: path to the folder containing the RLD files
converter.convert()

Convert a folder of RLD files to Text with NRG Cloud Convert API

import nrgpy
file_filter = "000110"
rld_directory = "rlds"
client_id = "https://cloud.nrgsystems.com/data-manager/api-setup"
client_secret = ""
converter = nrgpy.cloud_convert(
    file_filter=file_filter, 
    rld_dir=rld_directory, 
    client_id=client_id,
    client_secret=client_secret,
    start_date="2019-11-01",
    end_date="2019-11-30",
)
converter.process()
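
The client_id and client_secret values come from the NRG Cloud API setup page linked above. One way to keep them out of the script itself (plain standard-library Python, not an nrgpy feature; the environment variable names here are just examples) is to read them from environment variables:

import os

import nrgpy

client_id = os.environ["NRGCLOUD_CLIENT_ID"]          # example variable name
client_secret = os.environ["NRGCLOUD_CLIENT_SECRET"]  # example variable name

converter = nrgpy.cloud_convert(
    file_filter="000110",
    rld_dir="rlds",
    client_id=client_id,
    client_secret=client_secret,
)
converter.process()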

Convert a single RLD file to Text with NRG Cloud API

import nrgpy
filename = "path/to/rld"
txt_dir = "path/to/txt/"
client_id = "https://cloud.nrgsystems.com/data-manager/api-setup"
client_secret = ""
converter = nrgpy.cloud_convert(
    filename=filename, 
    client_id=client_id,
    client_secret=client_secret,
)

Read files

file_filter = "000110"
import nrgpy

reader = nrgpy.sympro_txt_read()
reader.concat_txt(
    txt_dir=text_folder_name, 
    file_filter=file_filter, 
    start_date="2019-11-01",
    end_date="2019-11-30",
)
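
Assuming the combined measurements land on reader.data as a pandas DataFrame (as with the Spidar reader shown later in this README), they can be inspected and exported with ordinary pandas calls:

print(reader.data.head())                        # first few rows of the combined export
reader.data.to_csv("site_000110_2019-11.csv")    # save the concatenated data to a single file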

RWD files

Convert a folder of RWD files to Text with Symphonie Data Retriever

import nrgpy

file_filter = '0434201902' # for Feb 2019 files from site 0434
rwd_directory = 'C:/Users/[user]/rwd/'
out_directory = 'C:/Users/[user]/txt/'

converter = nrgpy.local_rwd(file_filter=file_filter, rwd_dir=rwd_directory, out_dir=out_directory)
converter.convert()

Convert a folder of RWD files to Text with Symphonie Data Retriever on Linux

import nrgpy

file_filter = '0434201902' # for Feb 2019 files from site 0434
rwd_directory = '/home/user/datafiles/rwd'
out_directory = '/home/user/datafiles/txt'
wine_directory = '/home/user/prefix32/drive_c/' # path to wine's "C:\" drive


converter = nrgpy.local_rwd(
                file_filter=file_filter, 
                rwd_dir=rwd_directory, 
                out_dir=out_directory,
                wine_folder=wine_directory,
                use_site_file=False # set to True to use site files
            )
converter.convert()

You can also convert a single file with SDR, and save it in the same directory:

import nrgpy

filename = '/path/to/file'
converter = nrgpy.local_rwd(filename=filename)

Read TXT files exported from RWD files

import nrgpy

dt = 'rwd'
txt_file = '/path/to/file.txt'
reader = nrgpy.read_text_data(data_type=dt, filename=txt_file)

or concatenate a whole lot of files:

dt = 'rwd'
txt_dir = '/path/to/text/files'
file_filter = 'text_in_filenames_you_want'
reader = nrgpy.read_text_data(data_type=dt, txt_dir=txt_dir, file_filter=file_filter)
reader.concat()
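
The overview above mentions simple quality checks. A minimal, hedged sketch of one such check in plain pandas (assuming the concatenated data end up on reader.data with a Timestamp column; adjust names to match your export) looks for gaps in the 10-minute record sequence:

import pandas as pd

df = reader.data.copy()
df['Timestamp'] = pd.to_datetime(df['Timestamp'])  # column name assumed; adjust to your export

# flag records that arrive more than 10 minutes after the previous one
gaps = df['Timestamp'].diff() > pd.Timedelta(minutes=10)
print(df.loc[gaps, 'Timestamp'])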

Spidar files

Spidar Vertical Profiler remote sensors generate archived CSV data files (CSVs packaged in Zip format).

These can be read directly with the spidar_data_read function, as in the example below. See the docstring in spidar_txt.py for more information.

E.g.

In [1]: import nrgpy

In [2]: fname = "/spidar/1922AG7777_CAG70-SPPP-LPPP_NRG1_AVGWND_2019-07-07_1.zip"                            

In [3]: reader = nrgpy.spidar_data_read(filename=fname)                                                                              

In [4]: reader.heights                                                                                                         
Out[4]: [40, 60, 80, 90, 100, 120, 130, 160, 180, 200]

In [5]: reader.data                                                                                                            
Out[5]: 
                     pressure[mmHg]  temperature[C]  ...  dir_200_std[Deg]  wind_measure_200_quality[%]
Timestamp                                            ...                                               
2019-07-06 23:40:00          749.66           24.13  ...             28.77                           68
2019-07-06 23:50:00          749.63           24.08  ...             14.31                            0
2019-07-07 00:00:00          749.64           23.99  ...             20.59                            0
...
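
Because reader.data is a regular pandas DataFrame, the usual pandas tooling applies. For example, a hedged sketch of pulling out only the 80 m columns (assuming column names follow the pattern shown in the output above):

In [6]: cols_80m = [c for c in reader.data.columns if "_80_" in c]

In [7]: reader.data[cols_80m].describe()
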
Comments
  • Error while converting RWD to txt

    SDR test OK!
    Time elapsed: 0 s | 1 / 1 [=============================================] 100%
    Traceback (most recent call last):
      File "C:\Users\kenneth\AppData\Local\Continuum\Anaconda3\lib\site-packages\nrgpy\convert_rwd.py", line 294, in _copy_txt_file
        shutil.copy(txt_file_path, out_path)
      File "C:\Users\kenneth\AppData\Local\Continuum\Anaconda3\lib\shutil.py", line 241, in copy
        copyfile(src, dst, follow_symlinks=follow_symlinks)
      File "C:\Users\kenneth\AppData\Local\Continuum\Anaconda3\lib\shutil.py", line 120, in copyfile
        with open(src, 'rb') as fsrc:
    FileNotFoundError: [Errno 2] No such file or directory: 'C:\NRG\ScaledData\\018720120120200.txt'

    Unable to copy 018720120120200.txt to ScaledData\

    RWDs in : 1 TXTs out : 0 LOGs out : 0

    Difference : 1

    bug 
    opened by kenneth46k 18
  • Have you tried converting .rld files using nrgpy from RStudio?

    I'm more of an R and RStudio user. I've converted files using Python directly, but when I try to convert files from R Markdown with the reticulate package I can't. I run my code:

    # R
    library(reticulate)
    # R 
    use_condaenv("Anaconda3")
    # Python chunk
    import nrgpy
    text_folder_name = '/archivos/txt' 
    converter = nrgpy.local_rld(rld_dir="/Data/rld", out_dir=text_folder_name)
    converter.convert()
    

    but when I get to converter.convert() I get this message:

    Converting 1446 files from \archivos\txt
    Saving outputs to \Data\rld
    Unable to process files in directory

    To convert files, Python connects to SymphoniePRO Desktop, and I think RStudio can't get access to do that (I'm just guessing).

    Do you think it's possible to use Python and nrgpy from RStudio? Or is it impossible?

    Please, any advice is appreciated.

    I use: Python 3.8.5 with Anaconda3, R x64 4.0.3, RStudio v1.3.1093, Windows 10 Home and Pro (at work)

    opened by cajasc 6
  • convert_rld.local() method ONLY compatible with Windows OS. Please use convert_rld.nrg_convert_api() method instead.

    #!/usr/bin/env python3
    
    import argparse
    import os
    import nrgpy
    
    parser = argparse.ArgumentParser(description='Use NRG SDR software to decode anemometer data.')
    parser.add_argument('--pin', help='PIN code', required=True)
    parser.add_argument('--rld-dir', help='RLD input data directory')
    parser.add_argument('--out-dir', help='TXT output data directory')
    parser.add_argument('--file-filter')
    parser.add_argument('--tower', help='Assume rld_dir is raw and out_dir is txt.')
    args = parser.parse_args()
    
    if args.tower:
    	args.rld_dir = f'/root/raw/{args.tower}/'
    	args.out_dir = f'/root/txt/{args.tower}/'
    	if not args.file_filter:
    		args.file_filter = args.tower
    else:
    	if not args.rld_dir or not args.out_dir or not args.file_filter:
    		print('[Error]: Option --rld-dir, --out-dir, --file-filter or --tower should be provided!')
    		exit(1)
    	args.rld_dir = f'/root/{args.rld_dir}/'
    	args.out_dir = f'/root/{args.out_dir}/'
    
    if not os.path.isdir(args.rld_dir):
    	print(f'[Error]: RWD directory {args.rld_dir} does not exist!')
    	exit(1)
    
    if not os.path.isdir(args.out_dir):
    	os.makedirs(args.out_dir)
    
    wine_dir = '/root/prefix32/drive_c' # path to wine's "C:\" drive
    
    converter = nrgpy.local_rld(
    	encryption_pin=args.pin,
    	file_filter=args.file_filter,
    	rld_dir=args.rld_dir,
    	out_dir=args.out_dir,
    	sympro_path=wine_dir,
    	use_site_file=False # set to True to use site files
    )
    converter.convert()
    
    print('Done')
    

    (screenshot of the error attached)

    opened by peterwillcn 2
  • linux bash process rwd unable to copy C:\NRG\RawData\1238\20150812019.txt to root/txt/

    import nrgpy
    
    file_filter = '1238201508'
    rwd_directory = '/root/raw'
    out_directory = '/root/txt'
    wine_directory = '/root/prefix32/drive_c' # path to wine's "C:\" drive
    
    
    converter = nrgpy.local_rwd(
                    file_filter=file_filter,
                    rwd_dir=rwd_directory,
                    out_dir=out_directory,
                    wine_folder=wine_directory,
                    use_site_file=False # set to True to use site files
                )
    converter.convert()
    

    (screenshot of the error attached)

    opened by peterwillcn 2
  • Is there a way to get the sensor change date based on configuration change?

    I am able to convert RLD files. Now I'm trying to find the sensor change date when the configuration changes. Is there a trick to find this? I can view the sensor details in each text file, so otherwise I need to run a loop to read each file and figure out the change manually (sketched below).
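
    A hedged sketch of one way to run that loop, assuming each text export can be read on its own with nrgpy.sympro_txt_read and that the channel configuration is available on the reader's ch_info attribute (referenced elsewhere in this tracker):

    import glob

    import nrgpy

    previous = None
    for txt_file in sorted(glob.glob('text_outputs/*.txt')):
        reader = nrgpy.sympro_txt_read(filename=txt_file)
        current = reader.ch_info.to_string()  # plain-text snapshot of the channel configuration
        if previous is not None and current != previous:
            print(f"sensor configuration change detected in {txt_file}")
        previous = current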

    With the concat_txt function, the sensor configuration is overwritten by the latest text file.

    Regards, Kenneth

    opened by kenneth46k 2
  • Different behaviors for single files and directories

    For both local_rld and local_rwd, if a single file is passed as filename in the init params, it is converted to txt and saved on disk. This behavior is confusing because, as a user, I haven't called local_obj.convert() yet. On the other hand, if one passes a directory through the rwd_dir or rld_dir init params instead of a single file, conversion requires calling .convert() on the local object.

    opened by sushinoya 2
  • Invalid files are also processed "successfully"

    If a file does not exist, the output to the screen is still the same as when a valid file is processed by nrgpy_rld.local. Example -

    nrgpy_rld.local(filename="file_that_doesnt_exist.rld", out_dir="some_dir")
    

    Will print this to the console -

    file_that_doesnt_exist.rld ... 		[DONE]
    Queue processed
    

    Ideally, invalid paths or files should be reported differently to the user
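
    Until then, a hedged workaround sketch on the calling side (standard library only, reusing the naming from the example above) is to check the path before queueing it:

    import os

    rld_file = "file_that_doesnt_exist.rld"
    if not os.path.isfile(rld_file):
        raise FileNotFoundError(f"RLD file not found: {rld_file}")
    nrgpy_rld.local(filename=rld_file, out_dir="some_dir")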

    enhancement 
    opened by sushinoya 2
  • sympro_txt.sympro_txt_read.concat_txt: file_type not implemented

       243     file_type : str                                                                             
       244         type of export (meas, event, comm, sample, etc...)
    
    opened by dave-c-vt 1
  • sympro_txt.py:154 replace .append with .concat

    nrgpy/read/sympro_txt.py:154: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
      self.ch_info = self.ch_info.append(ch_list)
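
    A hedged sketch of the drop-in replacement the warning suggests (assuming ch_list is a DataFrame or something pandas can coerce into one):

    import pandas as pd

    # instead of: self.ch_info = self.ch_info.append(ch_list)
    self.ch_info = pd.concat([self.ch_info, pd.DataFrame(ch_list)])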

    opened by dave-c-vt 1
  • Call count_files with convert_time instead of start_time

    After rwd files are converted in nrgpy.convert_rwd.local.convert, there are steps to count the .txt files and the .log files. The count_files call for the .txt files uses self.convert_time (a float), but the one for .log files uses self.start_time (a datetime) - which fails. I changed it without giving it a lot of scrutiny, so you shouldn't blindly accept this...
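
    A hedged sketch of the one-line change described above, with the call signature taken from the traceback below:

    # nrgpy/convert_rwd.py, in convert(): pass the float convert_time rather than the datetime start_time
    log_count, log_files = count_files(self.out_dir, self.file_filter, 'log', show_files=True, start_time=self.convert_time)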

    Another warning... I've only done one pull request - and that was years ago.

    (env) PS C:\Users\todd.koym\projects2\metl> metl-convert-rwd.exe .\etc\development.ini 
    SDR test OK!
    Converting    1/30   111420200325676.RWD  ...  [DONE]
    Unable to copy 111420200325676.txt to C:\Users\todd.koym\Documents\Renewable NRG Systems\Exports\
    Converting    2/30   111420200325678.RWD  ...  [DONE]
    Converting    3/30   111420200325680.RWD  ...  [DONE]
    Unable to copy 111420200325680.txt to C:\Users\todd.koym\Documents\Renewable NRG Systems\Exports\
    Converting    4/30   111420200325682.RWD  ...  [DONE]
    Unable to copy 111420200325682.txt to C:\Users\todd.koym\Documents\Renewable NRG Systems\Exports\
    Converting    6/30   130320200411684.RWD  ...  [DONE]
    Converting    7/30   130320200412685.RWD  ...  [DONE]
    Converting    8/30   210120200411184.RWD  ...  [DONE]
    Converting    9/30   210120200412185.RWD  ...  [DONE]
    Converting   10/30   240620200412818.RWD  ...  [DONE]
    Converting   11/30   400320200411085.RWD  ...  [DONE]
    Converting   12/30   400320200412086.RWD  ...  [DONE]
    Converting   13/30   400520200411082.RWD  ...  [DONE]
    Converting   14/30   400520200412083.RWD  ...  [DONE]
    Converting   15/30   470220200412070.RWD  ...  [DONE]
    Converting   16/30   490420200412827.RWD  ...  [DONE]
    Converting   17/30   573620200412356.RWD  ...  [DONE]
    Converting   18/30   573620200413357.RWD  ...  [DONE]
    Converting   19/30   600120200411249.RWD  ...  [DONE]
    Converting   20/30   600120200412250.RWD  ...  [DONE]
    Converting   21/30   637120200412033.RWD  ...  [DONE]
    Converting   22/30   637120200413034.RWD  ...  [DONE]
    Converting   23/30   850320200412030.RWD  ...  [DONE]
    Converting   24/30   850320200413031.RWD  ...  [DONE]
    Converting   25/30   905320200412304.RWD  ...  [DONE]
    Converting   26/30   905320200413305.RWD  ...  [DONE]
    Converting   27/30   981220200412829.RWD  ...  [DONE]
    Converting   28/30   990120200412795.RWD  ...  [DONE]
    Converting   29/30   990120200413796.RWD  ...  [DONE]
    Unable to copy 990120200413796.txt to C:\Users\todd.koym\Documents\Renewable NRG Systems\Exports\
    Converting   30/30   991020200412081.RWD  ...  [DONE]
    Traceback (most recent call last):
      File "C:\Users\todd.koym\projects2\metl\env\Scripts\metl-convert-rwd-script.py", line 11, in <module>
        load_entry_point('metl', 'console_scripts', 'metl-convert-rwd')()
      File "c:\users\todd.koym\projects2\metl\src\metl\cli.py", line 112, in convert_rwd
        converter.convert()
      File "c:\users\todd.koym\documents\github\nrgpy\nrgpy\convert_rwd.py", line 137, in convert
        log_count, log_files = count_files(self.out_dir, self.file_filter, 'log', show_files=True, start_time=self.start_time)
      File "c:\users\todd.koym\documents\github\nrgpy\nrgpy\utilities.py", line 65, in count_files
        if os.path.getmtime(os.path.join(dirpath,x)) > start_time:
    TypeError: '>' not supported between instances of 'float' and 'datetime.datetime'
    (env) PS C:\Users\todd.koym\projects2\metl>
    
    opened by tkoym 1
  • txt filtering for sympro_txt_read.concat_txt

    If TXT files are mixed in with the RLDs, the sympro_txt_read.concat_txt method will fail. A check for valid TXT exports is needed.

    An interim fix would be to add exception handling for UnicodeDecodeError, which occurs when the original RLD file is in the same folder as the TXT export.
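
    Until that check exists, a hedged caller-side workaround sketch (standard library only) is to concatenate from a folder that contains nothing but the TXT exports:

    import glob
    import os
    import shutil
    import tempfile

    import nrgpy

    # copy only the .txt exports into a clean temporary folder before concatenating
    clean_dir = tempfile.mkdtemp()
    for f in glob.glob(os.path.join('text_outputs', '*.txt')):
        shutil.copy(f, clean_dir)

    reader = nrgpy.sympro_txt_read()
    reader.concat_txt(txt_dir=clean_dir)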

    opened by nrgpy 1
Releases
  • v1.8.4

Owner
  • NRG Tech Services