Source: hdfdata.py
import h5py
import numpy as np

from .evtdata import Event


class HDFDataFile(object):
    """Interface class to AT-TPC data stored in HDF5 files.

    This can be used to read full GET events from an HDF5 file. This should *not* be used to read the peaks-only files,
    as those are in a different format.

    **WARNING:** If you open a file in `'w'` mode, all data in that file will be deleted! Open in `'a'` mode to append.

    It is **very** important to close the file safely if you have it open for writing. If you do not close the file
    before your code ends, it might become corrupted. The safest way to use this class is with a context manager, like
    this::

        with pytpc.HDFDataFile('/path/to/file.h5', 'r') as hfile:
            # work with the file
            evt = hfile[0]  # for example

        # The rest of your code, which doesn't need the file, follows after the 'with' statement

    Once you leave the scope of the `with` statement, the file will be automatically closed safely.

    The data is stored in the HDF5 files as one table per event. The tables are stored in the group identified by
    `get_group_name`, and each dataset is named using the event ID. For example, the data for event 4 is stored in a
    table at `/get/4` by default, since the default `get_group_name` is `'/get'`.

    In the data tables, rows correspond to channels and columns contain metadata and the digitized traces. This means
    that the shape of each table is N x 517, where N is the number of channels hit in that event and 517 equals 512
    time buckets plus 5 more values to identify the CoBo, AsAd, AGET, channel, and pad number. Each row basically
    looks like this:

        +------+------+------+---------+-----+------+------+------+-----+--------+
        | CoBo | AsAd | AGET | Channel | Pad | TB 0 | TB 1 | TB 2 | ... | TB 511 |
        +------+------+------+---------+-----+------+------+------+-----+--------+

    Then again, if you use this class, you won't really need to worry about those details since it's all taken care of
    already!

    Parameters
    ----------
    fp : string
        The path to the HDF5 file.
    open_mode : string, optional
        An open mode for the file, e.g. 'r' for read-only, 'a' for read/append, or 'w' for truncate and write. Use any
        valid open mode for an h5py file object.
    get_group_name : string, optional
        The name of the group in the HDF5 file containing GET events. Usually just keep the default value of '/get'.
    canonical_evtid_path : str, optional
        Path to an HDF5 file containing a table mapping a canonical event ID to the event ID for each CoBo. This
        can be used to correct for CoBos that dropped triggers.
    canonical_evtid_key : str, optional
        The key to read in the canonical event ID file. The default value is 'canonical_evtids'.
    """
    def __init__(self, fp, open_mode='r', get_group_name='/get', canonical_evtid_path=None,
                 canonical_evtid_key='canonical_evtids'):
        self.fp = h5py.File(fp, mode=open_mode)
        self.get_group_name = get_group_name

        if canonical_evtid_path is not None:
            with h5py.File(canonical_evtid_path, 'r') as hf:
                self.canonical_evtid_table = hf[canonical_evtid_key][:]
        else:
            self.canonical_evtid_table = None

        self.fp.require_group(self.get_group_name)

    def close(self):
        """Close the file if it is open."""
        try:
            self.fp.close()
        except ValueError:
            pass

    @staticmethod
    def _unpack_get_event(evtid, timestamp, data):
        """Unpack the data table from the HDF file, and create an Event object.

        Parameters
        ----------
        evtid : int
            The event ID. This will be set in the resulting Event object.
        timestamp : int
            The timestamp value, which will also be set in the resulting Event object.
        data : array-like (or h5py Dataset)
            The data. The shape should be (N, 517) where N is the number of nonzero traces and 517 represents the
            columns (cobo, asad, aget, channel, pad, tb0, tb1, ..., tb511) from the data.

        Returns
        -------
        evt : pytpc.Event
            The Event object, as read from the file.
        """
        evt = Event(evtid, timestamp)
        evt.traces = np.zeros(data.shape[0], dtype=evt.dt)
        evt.traces['cobo'] = data[:, 0]
        evt.traces['asad'] = data[:, 1]
        evt.traces['aget'] = data[:, 2]
        evt.traces['channel'] = data[:, 3]
        evt.traces['pad'] = data[:, 4]
        evt.traces['data'] = data[:, 5:]
        return evt

    @staticmethod
    def _pack_get_event(evt):
        """Pack the provided event into a table for storage in the HDF5 file.

        Parameters
        ----------
        evt : pytpc.Event
            The event object to pack.

        Returns
        -------
        np.array
            The data from the event. The shape is (N, 517) where N is the number of nonzero traces and 517 represents
            the columns (cobo, asad, aget, channel, pad, tb0, tb1, ..., tb511).
        """
        t = evt.traces
        packed = np.zeros((len(t), 517), dtype='int16')
        packed[:, 0] = t['cobo']
        packed[:, 1] = t['asad']
        packed[:, 2] = t['aget']
        packed[:, 3] = t['channel']
        packed[:, 4] = t['pad']
        packed[:, 5:] = t['data']
        return packed

    def read_get_event(self, i):
        """Read the event identified by `i` from the file.

        This is the function to call to get an event from the HDF5 file. It will read the event from the file and
        return a pytpc Event object. The function will look for the event in the HDF5 group listed in
        `self.get_group_name`, so that must be set correctly on initializing the `HDFDataFile` object.

        Parameters
        ----------
        i : int (or str-like; see below)
            The identifier for the event in the file. The code will look for the event at `/get/i` in the file if
            `/get` is the `get_group_name` set on the `HDFDataFile` object. In principle, this identifier `i` should
            be an integer representing the event ID, but it could actually be anything that can be cast to a str.

        Returns
        -------
        pytpc.Event
            The event, as rebuilt from the data stored in the file.

        Raises
        ------
        KeyError
            If there is no dataset called `i` in the HDF5 group `self.get_group_name` (i.e. the event ID provided was
            not valid).
        """
        gp = self.fp[self.get_group_name]

        if self.canonical_evtid_table is not None:
            # Use the canonical event ID table to correct for a dropped trigger
            # in some CoBo. Read fragments of each required CoBo event, and then
            # reconstruct the full event.
            cobo_evt_ids = self.canonical_evtid_table[i]
            cobo_evts = {n: gp[str(n)][:] for n in np.unique(cobo_evt_ids)}

            evt_chunks = []
            for cobo_id, cobo_evt_id in enumerate(cobo_evt_ids):
                cobo_evt = cobo_evts[cobo_evt_id]
                chunk = cobo_evt[np.where(cobo_evt[:, 0] == cobo_id)]
                evt_chunks.append(chunk)

            rawevt = np.concatenate(evt_chunks, axis=0)
            evt_id = i
            timestamp = 0  # FIXME: make real?
        else:
            # There is no canonical event ID table, so just read the event.
            ds = gp[str(i)]
            evt_id = ds.attrs.get('evt_id', 0)
            timestamp = ds.attrs.get('timestamp', 0)
            rawevt = ds[:]

        return self._unpack_get_event(evt_id, timestamp, rawevt)

    def write_get_event(self, evt):
        """Write the given Event object to the file.

        This function should be used to append an event to the HDF5 file. This will naturally only work if the file
        is opened in a writable mode.

        The generated dataset will be placed in the HDF5 file at `/get/evt_id`, where `/get` is the `get_group_name`
        property of the `HDFDataFile` object and `evt_id` is the event ID property of the given Event object.

        The data is written with gzip compression and the HDF5 shuffle filter turned on. These filters are described
        in the HDF5 documentation. The format of the dataset is described in this class's documentation.

        Parameters
        ----------
        evt : pytpc.Event
            The event to write to the file.

        Raises
        ------
        RuntimeError
            If a dataset for this event already exists. This could also happen if the event ID is not unique in the
            run.
        """
        evt_id = evt.evt_id
        ts = evt.timestamp

        gp = self.fp[self.get_group_name]
        flat = self._pack_get_event(evt)

        ds = gp.create_dataset(str(evt_id), data=flat, compression='gzip', shuffle=True)
        ds.attrs['evt_id'] = evt_id
        ds.attrs['timestamp'] = ts

        self.fp.flush()

    def evtids(self):
        """Returns an iterator over the set of event IDs in the file, as integers."""
        return (int(key) for key in self.fp[self.get_group_name])

    def __len__(self):
        return len(self.fp[self.get_group_name])

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self.close()

    def __iter__(self):
        return (self.read_get_event(n) for n in self.fp[self.get_group_name])

    def __getitem__(self, i):
        ...
Source: parameter.py
...

    def get_mutated_value(self, member):
        raise NotImplementedError()

    def get_configuration(self, member):
        if not self.get_group_name():
            raise RuntimeError("Group not selected")
        return getattr(member.configuration, self.get_group_name())

    def has_configuration(self, member):
        if not self.get_group_name():
            raise RuntimeError("Group not selected")
        return hasattr(member.configuration, self.get_group_name())

    def get_value(self, member):
        return getattr(self.get_configuration(member), self.get_name())

    def set_value(self, member, value):
        return setattr(self.get_configuration(member), self.get_name(), value)

    def get_record_name(self):
        if not self.shared:
            if not self.get_group_name():
                raise RuntimeError("Group not selected")
            record_name = "%s_%s" % (self.get_group_name(), self.get_name())
        else:
            if not self.get_choice_name():
                raise RuntimeError("Choice not selected")
            record_name = "%s_%s" % (self.get_choice_name(), self.get_name())
        return record_name

    def outline_simulation(self, simulation, outline):
        record_name = self.get_record_name()
        if not outline.has_attribute(record_name, DataType.Battle):
            outline.append_attribute(self.get_record_name(), DataType.Battle, [Role.Parameter], self.label)

    def record_member(self, member, record):
        if self.has_configuration(member):
            setattr(record, self.get_record_name(), self.get_value(member))
...
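The naming rule in `get_record_name` above (non-shared parameters are namespaced by their group, shared ones by their choice) can be restated as a stand-alone function. `make_record_name` is a hypothetical helper written for illustration, not part of the original class:

```python
def make_record_name(name, group_name=None, choice_name=None, shared=False):
    """Build a record name the way get_record_name does above."""
    if not shared:
        # Non-shared parameters are scoped to their configuration group.
        if not group_name:
            raise RuntimeError("Group not selected")
        return "%s_%s" % (group_name, name)
    # Shared parameters are scoped to the selected choice instead.
    if not choice_name:
        raise RuntimeError("Choice not selected")
    return "%s_%s" % (choice_name, name)

print(make_record_name("learning_rate", group_name="optimizer"))
# prints "optimizer_learning_rate"
```

Calling it with neither a group nor `shared=True` raises `RuntimeError`, matching the guard clauses in the excerpt.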
Source: consumers.py
...
# Likely imports for this truncated excerpt:
from urllib import parse
from channels.generic.websocket import AsyncJsonWebsocketConsumer


class ConsumerBase(AsyncJsonWebsocketConsumer):
    def get_query_params(self):
        query_string = self.scope['query_string'].decode()
        return dict(parse.parse_qsl(query_string))

    def get_group_name(self):
        raise NotImplementedError('You must create subclasses and implement this method')

    async def connect(self):
        await self.accept()
        await self.channel_layer.group_add(self.get_group_name(), self.channel_name)

    async def disconnect(self, code):
        await self.channel_layer.group_discard(self.get_group_name(), self.channel_name)

    async def group_message(self, event):
        await self.send_json(content=event['content'])


class ChatConsumer(ConsumerBase):
    def get_group_name(self):
        query_params = self.get_query_params()
        return 'chat-{}'.format(query_params.get('sala'))

    async def receive_json(self, content, **kwargs):
        await self.channel_layer.group_send(
            self.get_group_name(),
            {'type': 'group.message', 'content': content}
        )


class TestConsumer(ConsumerBase):
    def get_group_name(self):
        ...
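The way `ChatConsumer.get_group_name` derives a channel-layer group from the websocket query string can be exercised without Django Channels at all, since it is just query-string parsing. `group_name_from_query` is an illustrative stand-alone helper, not Channels API; `sala` ("room" in Spanish) is the parameter name used by the original code:

```python
from urllib.parse import parse_qsl

def group_name_from_query(query_string):
    """Mimic ChatConsumer.get_group_name for a raw query string."""
    params = dict(parse_qsl(query_string))
    return 'chat-{}'.format(params.get('sala'))

print(group_name_from_query('sala=lobby'))  # prints "chat-lobby"
```

Each distinct `sala` value thus maps to its own group, so messages sent with `group_send` fan out only to clients that connected with the same room parameter.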