public class HardEM extends ExpectationMaximization
Modifier and Type | Field and Description |
---|---|
`static boolean` | `ADAGRAD_DEFAULT` |
`static String` | `ADAGRAD_KEY` Key for Boolean property that indicates whether to use AdaGrad subgradient scaling, the adaptive subgradient algorithm of John Duchi, Elad Hazan, and Yoram Singer (JMLR 2011). |
`static String` | `CONFIG_PREFIX` Prefix of property keys used by this class. |
`static double` | `MIN_SCALING_FACTOR` |
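When `ADAGRAD_KEY` is enabled, each weight's gradient step is divided by the root of its accumulated squared gradients, so frequently updated weights take progressively smaller steps. A minimal self-contained sketch of that scaling rule (illustrative only, not PSL's internal implementation; the clamping by a minimum scaling factor mirrors the role of `MIN_SCALING_FACTOR`):

```java
// AdaGrad-style subgradient scaling (Duchi, Hazan, and Singer):
// divide each coordinate's step by the root of its squared-gradient history.
public class AdaGradSketch {
    // Returns the scaled step and updates the running squared-gradient sums in place.
    public static double[] scaledStep(double[] gradient, double[] squaredGradSum,
                                      double baseStepSize, double minScalingFactor) {
        double[] step = new double[gradient.length];
        for (int i = 0; i < gradient.length; i++) {
            squaredGradSum[i] += gradient[i] * gradient[i];
            // Clamp the divisor so an empty history cannot cause division by zero.
            double scale = Math.max(Math.sqrt(squaredGradSum[i]), minScalingFactor);
            step[i] = baseStepSize * gradient[i] / scale;
        }
        return step;
    }

    public static void main(String[] args) {
        double[] accumulator = new double[2];
        double[] step = scaledStep(new double[] {3.0, 0.0}, accumulator, 1.0, 1e-8);
        // First coordinate: 1.0 * 3.0 / sqrt(9.0) = 1.0; second stays 0.0.
        System.out.println(step[0] + " " + step[1]);
    }
}
```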
Fields inherited from class ExpectationMaximization
`emIteration, ITER_DEFAULT, ITER_KEY, iterations, tolerance, TOLERANCE_DEFAULT, TOLERANCE_KEY`

Fields inherited from class VotedPerceptron
`AVERAGE_STEPS_DEFAULT, AVERAGE_STEPS_KEY, averageSteps, baseStepSize, CLIP_NEGATIVE_WEIGHTS_DEFAULT, CLIP_NEGATIVE_WEIGHTS_KEY, clipNegativeWeights, CUT_OBJECTIVE_DEFAULT, CUT_OBJECTIVE_KEY, cutObjective, inertia, INERTIA_DEFAULT, INERTIA_KEY, L1_REGULARIZATION_DEFAULT, L1_REGULARIZATION_KEY, l1Regularization, L2_REGULARIZATION_DEFAULT, L2_REGULARIZATION_KEY, l2Regularization, maxNumSteps, NUM_STEPS_DEFAULT, NUM_STEPS_KEY, numSteps, SCALE_GRADIENT_DEFAULT, SCALE_GRADIENT_KEY, SCALE_STEP_SIZE_DEFAULT, SCALE_STEP_SIZE_KEY, scaleGradient, scaleStepSize, STEP_SIZE_DEFAULT, STEP_SIZE_KEY, ZERO_INITIAL_WEIGHTS_DEFAULT, ZERO_INITIAL_WEIGHTS_KEY, zeroInitialWeights`

Fields inherited from class WeightLearningApplication
`allRules, atomManager, evaluator, EVALUATOR_DEFAULT, EVALUATOR_KEY, expectedIncompatibility, GROUND_RULE_STORE_DEFAULT, GROUND_RULE_STORE_KEY, groundRuleStore, inLatentMPEState, inMPEState, latentGroundRuleStore, latentTermStore, MAX_RANDOM_WEIGHT, MIN_ADMM_STEPS, mutableRules, observedDB, observedIncompatibility, RANDOM_WEIGHTS_DEFAULT, RANDOM_WEIGHTS_KEY, reasoner, REASONER_DEFAULT, REASONER_KEY, rvDB, supportsLatentVariables, TERM_GENERATOR_DEFAULT, TERM_GENERATOR_KEY, TERM_STORE_DEFAULT, TERM_STORE_KEY, termGenerator, termStore, trainingMap`
Constructor and Description |
---|
`HardEM(List<Rule> rules, Database rvDB, Database observedDB)` |
`HardEM(Model model, Database rvDB, Database observedDB)` |
Modifier and Type | Method and Description |
---|---|
`protected double[]` | `computeScalingFactor()` Computes the amount by which to scale the gradient for each rule. |
Methods inherited from class ExpectationMaximization
`doLearn, eStep, mStep`

Methods inherited from class VotedPerceptron
`computeRegularizer, getLoss, setBudget`

Methods inherited from class WeightLearningApplication
`close, computeExpectedIncompatibility, computeLatentMPEState, computeLoss, computeMPEState, computeObservedIncompatibility, createAtomManager, getWLA, initGroundModel, initGroundModel, initGroundModel, initLatentGroundModel, learn, postInitGroundModel, setDefaultRandomVariables, setLabeledRandomVariables`
Field Detail

public static final String CONFIG_PREFIX
public static final String ADAGRAD_KEY
public static final boolean ADAGRAD_DEFAULT
public static final double MIN_SCALING_FACTOR
Method Detail

protected double[] computeScalingFactor()
Computes the amount by which to scale the gradient for each rule.
Overrides:
`computeScalingFactor` in class `VotedPerceptron`
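Per-rule gradient scaling keeps rules with many groundings from dominating the weight update. The sketch below shows one common form of such a scaling factor, with a lower clamp playing the role of `MIN_SCALING_FACTOR`; the grounding counts are hypothetical and this is not PSL's exact implementation:

```java
public class RuleGradientScalingSketch {
    // One scaling factor per rule, clamped below by a minimum so that a rule
    // with zero groundings never causes a division by zero.
    public static double[] computeScalingFactor(int[] groundingCounts, double minScalingFactor) {
        double[] factors = new double[groundingCounts.length];
        for (int i = 0; i < groundingCounts.length; i++) {
            factors[i] = Math.max(groundingCounts[i], minScalingFactor);
        }
        return factors;
    }

    public static void main(String[] args) {
        int[] groundingCounts = {120, 0, 7};  // hypothetical counts per rule
        double[] factors = computeScalingFactor(groundingCounts, 1e-2);
        // The gradient for rule i would then be divided by factors[i].
        for (double f : factors) {
            System.out.println(f);
        }
    }
}
```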
Copyright © 2018 University of California, Santa Cruz. All rights reserved.