How to use the _load_transformers method in lisa

Best Python code snippets using lisa_python

preprocessing.py

Source: preprocessing.py on GitHub


...
        cm = ConfigManager()
        if mode == 'pred':
            expected_keys = ['transformers_path']
            self.config = cm.load_config(config_path, expected_keys)
            self.transformers = self._load_transformers(self.config)
        else:
            expected_keys = []
            self.config = cm.load_config(config_path, expected_keys)
            self.transformers = {
                'fillna_vals': {},
                'onehot_encoders': {},
                'count_corresp_tables': {},
                'minmax_scaler': None
            }

    def _load_transformers(self, config):
        """Retrieve the transformers dictionary from the saved log."""
        prefix = '/opt/ml/model'
        filename = Path(config['transformers_path']).name
        trans_path_for_pred = Path(prefix).joinpath(filename)
        transformers = joblib.load(trans_path_for_pred)
        expected_keys = [
            'fillna_vals', 'onehot_encoders', 'count_corresp_tables',
            'minmax_scaler'
        ]
        Utils.validate_dict(transformers, expected_keys)
        return transformers

    def save_transformers(self,
                          dst_dir='./.models',
...
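The method above resolves the artifact file name against a fixed prefix, loads it with joblib, and validates that the expected keys are present. For experimentation outside the class, a minimal standalone sketch of the same pattern is shown below; the function name load_transformers, the prefix default, and the example path are illustrative assumptions, not part of the original project.

from pathlib import Path

import joblib


def load_transformers(transformers_path, prefix='/opt/ml/model'):
    """Standalone sketch of the pattern above: resolve the artifact file name
    under the model directory, load it with joblib, and verify the keys."""
    path = Path(prefix).joinpath(Path(transformers_path).name)
    transformers = joblib.load(path)
    expected_keys = {'fillna_vals', 'onehot_encoders',
                     'count_corresp_tables', 'minmax_scaler'}
    missing = expected_keys - transformers.keys()
    if missing:
        raise KeyError(f"transformers dict is missing keys: {missing}")
    return transformers

# Example (hypothetical path): only the file name is used, so the artifact is
# expected to have been copied into the prefix directory beforehand.
# transformers = load_transformers('runs/run-01/transformers.joblib')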


transformer.py

Source: transformer.py on GitHub


...
                        f"'{item}'"
                    )
                _sort_dfs(transformers, dependent, visited, sorted_transformers)
        sorted_transformers.append(transformer)


def _load_transformers(
    runbook_builder: RunbookBuilder,
    variables: Optional[Dict[str, VariableEntry]] = None,
) -> Dict[str, schema.Transformer]:
    transformers_data = runbook_builder.partial_resolve(
        partial_name=constants.TRANSFORMER, variables=variables
    )
    transformers = schema.load_by_type_many(schema.Transformer, transformers_data)
    return {x.name: x for x in transformers}


def _run_transformers(
    runbook_builder: RunbookBuilder,
    phase: str = constants.TRANSFORMER_PHASE_INIT,
) -> Dict[str, VariableEntry]:
    # resolve variables
    transformers_dict = _load_transformers(runbook_builder=runbook_builder)
    transformers_runbook = [x for x in transformers_dict.values()]
    # re-sort the runbooks; the sorted order is used in the real run
    transformers_runbook = _sort(transformers_runbook)
    copied_variables: Dict[str, VariableEntry] = dict()
    for value in runbook_builder.variables.values():
        copied_variables[value.name] = value.copy()
    factory = subclasses.Factory[Transformer](Transformer)
    for runbook in transformers_runbook:
        # load the original runbook to resolve variables again.
        raw_transformers = _load_transformers(
            runbook_builder=runbook_builder, variables=copied_variables
        )
        runbook = raw_transformers[runbook.name]
        # if phase is empty, pick up all of them.
        if not runbook.enabled or (phase and runbook.phase != phase):
            continue
        derived_builder = runbook_builder.derive(copied_variables)
        transformer = factory.create_by_runbook(
            runbook=runbook, runbook_builder=derived_builder
        )
        transformer.initialize()
        values = transformer.run()
        merge_variables(copied_variables, values)
    return copied_variables
...
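In LISA itself these are module-private helpers in lisa/transformer.py and are normally driven by the runner rather than imported directly. The rough sketch below shows how they fit together, assuming a RunbookBuilder has already been constructed by LISA's entry point; it only uses names visible in the snippet above, and the surrounding setup is intentionally omitted.

# Sketch only: assumes `runbook_builder` was created by LISA's startup code
# from a runbook that contains a transformer section.

# _load_transformers resolves that section into schema.Transformer runbook
# objects keyed by transformer name.
transformers_by_name = _load_transformers(runbook_builder=runbook_builder)
print(sorted(transformers_by_name))

# _run_transformers then sorts them by their dependencies, instantiates each
# enabled transformer for the requested phase, runs it, and merges the
# variables it produces into a copy of the runbook variables.
updated_variables = _run_transformers(
    runbook_builder=runbook_builder,
    phase=constants.TRANSFORMER_PHASE_INIT,
)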


model.py

Source: model.py on GitHub


...
class IceGRU:
    def __init__(self, model_path: Path, device: str = "cpu") -> None:
        self._model_path = model_path
        self.device = device
        self.transformers = self._load_transformers()
        self.model = self._load_model(self._model_path)
        self._n_seqs = len(self.__seq_vars)

    def predict(self, batch: List[Dict[str, np.ndarray]]) -> List[Dict[str, float]]:
        """Calculates predictions on a batch of data.

        The batch of data must be a list of dictionaries, where each dictionary contains the key-value pairs
        - dom_x: a numpy array of the x-coordinates of the event
        - dom_y: a numpy array of the y-coordinates of the event
        - dom_z: a numpy array of the z-coordinates of the event
        - dom_time: a numpy array of the time-coordinates of the event
        - dom_charge: a numpy array of the charge values of the event
        - dom_atwd: a numpy array with digitizer indicators (integers)
        - dom_pulse_width: a numpy array of pulse widths of the event.
        The event is expected to be time-ordered.

        Args:
            batch (List[Dict[str, np.ndarray]]): A batch of events as described above
        Returns:
            List[Dict[str, float]]: Predictions for the events
        """
        batch_list_transformed = self._dicts_to_arrays(self._transform_batch(batch))
        batch_packed_sequence, sequence_lengths, new_order = self._pad_sequence(
            batch_list_transformed
        )
        batch_packed = (batch_packed_sequence, sequence_lengths)
        prediction_transformed = self._predict(batch_packed)
        prediction = self._array_to_dicts(
            self._inverse_transform(prediction_transformed.numpy())
        )
        prediction_reordered = [
            e[0] for e in sorted(zip(prediction, new_order), key=lambda x: x[1])
        ]
        return prediction_reordered

    def _dict_to_array(self, event):
        n_doms = len(event[self.__seq_vars[0]])
        seq_arr = np.zeros((self._n_seqs, n_doms))
        for i_var, var in enumerate(self.__seq_vars):
            seq_arr[i_var, :] = event[var]
        return seq_arr

    def _dicts_to_arrays(self, batch):
        for i_event, event in enumerate(batch):
            batch[i_event] = self._dict_to_array(event)
        return batch

    def _inverse_transform(self, pred_array):
        for i_var, var in enumerate(self.__targets):
            transformer = self.transformers.get(var)
            pred = pred_array[:, i_var]
            if transformer:
                inv_transformed_pred = transformer.inverse_transform(
                    pred.reshape(-1, 1)
                ).reshape(-1)
                pred_array[:, i_var] = inv_transformed_pred if transformer else pred
        return pred_array

    def _load_model(self, path):
        with open(Path.joinpath(path, "architecture_pars.json"), "r") as f:
            arch_pars = json.load(f)
        model = MakeModel(arch_pars)
        p = Path.joinpath(path, "model_weights.pth")
        model.load_state_dict(torch.load(p, map_location="cpu"))
        model.to(self.device)
        return model

    def _load_transformers(self):
        with open(self.__transformers_path, "rb") as f:
            transformers = pickle.load(f)
        return transformers

    def _pad_sequence(self, batch):
        indexed_batch = [(entry, i_entry) for i_entry, entry in enumerate(batch)]
        sorted_batch = sorted(indexed_batch, key=lambda x: x[0].shape[1], reverse=True)
        sequences = [torch.tensor(np.transpose(x[0])) for x in sorted_batch]
        indices = [x[1] for x in sorted_batch]
        sequence_lengths = torch.LongTensor([len(x) for x in sequences])
        sequences_padded = torch.nn.utils.rnn.pad_sequence(sequences, batch_first=True)
        return sequences_padded.float(), sequence_lengths, indices

    def _predict(self, batch):
        self.model.eval()
        with torch.no_grad():
...
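A hypothetical way to drive this class is sketched below. The model directory layout (architecture_pars.json, model_weights.pth, and a pickled transformers file referenced by __transformers_path) is inferred from _load_model and _load_transformers; the directory name and the event arrays are placeholders standing in for real detector data.

import numpy as np
from pathlib import Path

# Sketch only: "./trained_model" and the fabricated event below are assumptions
# made for illustration; a real event carries time-ordered detector readouts.
model = IceGRU(model_path=Path("./trained_model"), device="cpu")

n_doms = 12  # number of hits in this fabricated event
event = {
    key: np.random.rand(n_doms)
    for key in ("dom_x", "dom_y", "dom_z", "dom_time",
                "dom_charge", "dom_atwd", "dom_pulse_width")
}
predictions = model.predict([event])  # one dict of target -> float per event
print(predictions[0])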


Blogs

Check out the latest blogs from LambdaTest on this topic:

The Art of Testing the Untestable

It’s strange to hear someone declare, “This can’t be tested.” In reply, I contend that everything can be tested; however, one must be willing to accept the outcome of testing, which might include failure, financial loss, or personal injury. With that understanding, is there really anything that can’t be tested?

A Comprehensive Guide On JUnit 5 Extensions

JUnit is one of the most popular unit testing frameworks in the Java ecosystem. The JUnit 5 version (also known as Jupiter) contains many exciting innovations, including support for new features in Java 8 and above. However, many developers still prefer the JUnit 4 framework, since certain JUnit 5 features, such as parallel execution, are still in an experimental phase.

Continuous Integration explained with Jenkins deployment

Continuous integration is a coding philosophy and set of practices that encourage development teams to make small code changes and check them into a version control repository regularly. Most modern applications necessitate the development of code across multiple platforms and tools, so teams require a consistent mechanism for integrating and validating changes. Continuous integration creates an automated way for developers to build, package, and test their applications. A consistent integration process encourages developers to commit code changes more frequently, resulting in improved collaboration and code quality.

Putting Together a Testing Team

As part of one of my consulting efforts, I worked with a mid-sized company that was looking to move toward a more agile manner of developing software. As with any shift in work style, there is some bewilderment and, for some, considerable anxiety. People are being challenged to leave their comfort zones and embrace a continuously changing, dynamic working environment. And, dare I say it, testing may be the most ‘disturbed’ of the software roles in agile development.

A Reconsideration of Software Testing Metrics

There is just one area where each member of the software testing community has a distinct point of view! Metrics! This contentious issue sparks intense disputes, and most conversations finish with no definitive conclusion. It covers a wide range of topics: How can testing efforts be measured? What is the most effective technique to assess effectiveness? Which of the many components should be quantified? How can we measure the quality of our testing performance, among other things?

Automation Testing Tutorials

Learn to execute automation testing from scratch with the LambdaTest Learning Hub, right from setting up the prerequisites and running your first automation test to following best practices and diving deeper into advanced test scenarios. The LambdaTest Learning Hubs compile step-by-step guides to help you become proficient with different test automation frameworks such as Selenium, Cypress, and TestNG.


YouTube

You can also refer to the video tutorials on the LambdaTest YouTube channel for step-by-step demonstrations from industry experts.

