How to use the flop method in fMBT

Best Python code snippets using fMBT_python

search_main.py

Source: search_main.py (GitHub)


...
top5.update(prec5.item(), base_inputs.size(0))
# update the architecture
arch_optimizer.zero_grad()
logits, expected_flop = network(arch_inputs)
flop_cur = network.module.get_flop('genotype', None, None)
flop_loss, flop_loss_scale = get_flop_loss(expected_flop, flop_cur, flop_need, flop_tolerant)
acls_loss = criterion(logits, arch_targets)
arch_loss = acls_loss + flop_loss * flop_weight
arch_loss.backward()
arch_optimizer.step()

# record
arch_losses.update(arch_loss.item(), arch_inputs.size(0))
arch_flop_losses.update(flop_loss_scale, arch_inputs.size(0))
arch_cls_losses.update(acls_loss.item(), arch_inputs.size(0))

# measure elapsed time
batch_time.update(time.time() - end)
end = time.time()
...
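The architecture loss here combines the classification loss with a FLOP penalty: get_flop_loss compares the (differentiable) expected FLOP count of the searched architecture against a target budget and returns both a loss term and a plain number for logging. The helper itself is not part of the snippet; the sketch below is a minimal, hypothetical implementation, assuming flop_tolerant is a relative tolerance (e.g. 0.1 for +/-10%) around the flop_need budget.

# Hypothetical sketch of get_flop_loss -- the real implementation is not shown
# in the snippet above; only the call signature and return values are assumed.
def get_flop_loss(expected_flop, flop_cur, flop_need, flop_tolerant):
    # expected_flop : differentiable tensor, FLOPs predicted from the
    #                 architecture parameters (carries the gradient)
    # flop_cur      : FLOPs of the currently decoded genotype (gating/logging)
    # flop_need     : target FLOP budget
    # flop_tolerant : assumed relative tolerance, e.g. 0.1 for +/-10%
    lower = flop_need * (1 - flop_tolerant)
    upper = flop_need * (1 + flop_tolerant)
    if flop_cur < lower:            # architecture too small: push FLOPs up
        flop_loss = -expected_flop / flop_need
    elif flop_cur > upper:          # architecture too large: push FLOPs down
        flop_loss = expected_flop / flop_need
    else:                           # inside the tolerance band: no penalty
        flop_loss = expected_flop * 0.0
    # a plain float copy of the penalty for the AverageMeter logs
    flop_loss_scale = float(flop_loss.item()) if hasattr(flop_loss, "item") else float(flop_loss)
    return flop_loss, flop_loss_scale

The returned flop_loss is then weighted by flop_weight and added to the classification loss before backpropagation, as in the snippet above.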


search_main_v2.py

Source: search_main_v2.py (GitHub)


...
top5.update(prec5.item(), base_inputs.size(0))
# update the architecture
arch_optimizer.zero_grad()
logits, expected_flop = network(arch_inputs)
flop_cur = network.module.get_flop('genotype', None, None)
flop_loss, flop_loss_scale = get_flop_loss(expected_flop, flop_cur, flop_need, flop_tolerant)
acls_loss = criterion(logits, arch_targets)
arch_loss = acls_loss + flop_loss * flop_weight
arch_loss.backward()
arch_optimizer.step()

# record
arch_losses.update(arch_loss.item(), arch_inputs.size(0))
arch_flop_losses.update(flop_loss_scale, arch_inputs.size(0))
arch_cls_losses.update(acls_loss.item(), arch_inputs.size(0))

# measure elapsed time
batch_time.update(time.time() - end)
end = time.time()
...
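Both snippets record statistics through objects exposing an update(value, n) method (top5, arch_losses, batch_time, and so on). These classes are not shown; the sketch below assumes the common AverageMeter pattern used in PyTorch training scripts, where each metric is averaged over samples seen so far.

class AverageMeter:
    """Minimal sketch: tracks a running average of a metric, weighted by batch size."""
    def __init__(self):
        self.val, self.sum, self.count, self.avg = 0.0, 0.0, 0, 0.0

    def update(self, val, n=1):
        # val: latest value (e.g. arch_loss.item()); n: number of samples it covers
        self.val = val
        self.sum += val * n
        self.count += n
        self.avg = self.sum / self.count

# usage matching the snippets above (assumed, not shown in the source):
# arch_losses = AverageMeter()
# arch_losses.update(arch_loss.item(), arch_inputs.size(0))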


cifar_resnet_flop.py

Source: cifar_resnet_flop.py (GitHub)


def cifar_resnet_flop(layer=110, prune_rate=1):
    '''
    :param layer: number of layers of the ResNet for CIFAR, one of 110, 56, 32, 20
    :param prune_rate: 1 means baseline
    :return: flop of the network
    '''
    flop = 0
    channel = [16, 32, 64]
    width = [32, 16, 8]
    stage = int(layer / 3)
    for index in range(0, layer, 1):
        if index == 0:  # first conv layer before the blocks
            flop += channel[0] * width[0] * width[0] * 9 * 3 * prune_rate
        elif index in [1, 2]:  # first block of the first stage
            flop += channel[0] * width[0] * width[0] * 9 * channel[0] * (prune_rate ** 2)
        elif 2 < index <= stage:  # other blocks of the first stage
            if index % 2 != 0:
                # first layer of a block: only the output channels are reduced, the input channels stay the same
                flop += channel[0] * width[0] * width[0] * 9 * channel[0] * prune_rate
            elif index % 2 == 0:
                # second layer of a block: both input and output channels are reduced
                flop += channel[0] * width[0] * width[0] * 9 * channel[0] * (prune_rate ** 2)
        elif stage < index <= stage * 2:  # second stage
            if index % 2 != 0:
                flop += channel[1] * width[1] * width[1] * 9 * channel[1] * prune_rate
            elif index % 2 == 0:
                flop += channel[1] * width[1] * width[1] * 9 * channel[1] * (prune_rate ** 2)
        elif stage * 2 < index <= stage * 3:  # third stage
            if index % 2 != 0:
                flop += channel[2] * width[2] * width[2] * 9 * channel[2] * prune_rate
            elif index % 2 == 0:
                flop += channel[2] * width[2] * width[2] * 9 * channel[2] * (prune_rate ** 2)
    # offset for the dimension change between stages: the first block of a stage
    # actually takes fewer input channels than counted in the loop above
    offset1 = channel[1] * width[1] * width[1] * 9 * channel[1] * prune_rate \
              - channel[1] * width[1] * width[1] * 9 * channel[0] * prune_rate
    offset2 = channel[2] * width[2] * width[2] * 9 * channel[2] * prune_rate \
              - channel[2] * width[2] * width[2] * 9 * channel[1] * prune_rate
    flop = flop - offset1 - offset2
    # print(flop)
    return flop


def cal_cifar_resnet_flop(layer, prune_rate):
    '''
    :param layer: number of layers of the ResNet for CIFAR, one of 110, 56, 32, 20
    :param prune_rate: 1 means baseline
    :return:
    '''
    pruned_flop = cifar_resnet_flop(layer, prune_rate)
    baseline_flop = cifar_resnet_flop(layer, 1)
    print(
        "pruning rate of layer {:d} is {:.1f}, pruned FLOP is {:.0f}, "
        "baseline FLOP is {:.0f}, FLOP reduction rate is {:.4f}"
        .format(layer, prune_rate, pruned_flop, baseline_flop, 1 - pruned_flop / baseline_flop))


if __name__ == '__main__':
    layer_list = [110, 56, 32, 20]
    pruning_rate_list = [0.9, 0.8, 0.7]
    for layer in layer_list:
        for pruning_rate in pruning_rate_list:
            ...
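In this snippet, every 3x3 convolution contributes output_channels x height x width x 9 x input_channels multiply-accumulate operations, scaled by the prune rate (once when only the output channels are pruned, squared when both sides are pruned). A hedged usage sketch, assuming the file above is importable as a module named cifar_resnet_flop:

# Usage sketch (assumed module name and paths; the functions and their
# signatures come from the snippet above).
from cifar_resnet_flop import cifar_resnet_flop, cal_cifar_resnet_flop

baseline = cifar_resnet_flop(layer=56, prune_rate=1)    # unpruned ResNet-56
pruned = cifar_resnet_flop(layer=56, prune_rate=0.8)    # 80% of channels kept
print("FLOP reduction: {:.2%}".format(1 - pruned / baseline))

# or use the helper, which prints the same comparison in one line:
cal_cifar_resnet_flop(56, 0.8)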


Automation Testing Tutorials

Learn to execute automation testing from scratch with the LambdaTest Learning Hub: from setting up the prerequisites and running your first automation test to following best practices and diving into advanced test scenarios. The LambdaTest Learning Hubs compile step-by-step guides to help you become proficient with different test automation frameworks such as Selenium, Cypress, and TestNG.

LambdaTest Learning Hubs:

YouTube

You can also refer to the video tutorials on the LambdaTest YouTube channel for step-by-step demonstrations from industry experts.

