
Loading a model with tf.saved_model.loader

Open shancarter opened this issue 6 years ago • 4 comments

I was trying to import a custom model to use with lucid, and using the tf.saved_model loader. Is there support for importing a graph_def in this way, or am I out of luck?

Here's what I've tried so far.

with tf.Session() as sess:
  tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], model_path)
  with tf.Graph().as_default() as graph:
    model = CustomModel()
    model.graph_def = tf.get_default_graph().as_graph_def()
    T = render.make_vis_T(model, "Mixed5b:1")
    tf.initialize_all_variables().run()
    for i in range(10):
      T("vis_op").run()
      showarray(T("input").eval()[0])

I get the following error:

ValueError: Cannot use the default session to execute operation: the operation's graph is different from the session's graph. Pass an explicit session to run(session=sess).

at this point in lucid:

render.py in make_vis_T(model, objective_f, param_f, optimizer, transforms, relu_gradient_override)
    169   global_step = tf.train.get_or_create_global_step()
    170   init_global_step = tf.variables_initializer([global_step])
--> 171   init_global_step.run()
    172 
    173   if relu_gradient_override:
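The error itself is easy to reproduce outside of lucid: a tf.Session is permanently bound to whatever graph was the default when it was constructed, so ops created in a graph opened afterwards cannot run through it. A minimal sketch (using the TF1 compat API; the variable names are illustrative, not from lucid):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

outer = tf.Graph()
with outer.as_default():
    sess = tf.Session()      # sess is bound to `outer` for its lifetime

inner = tf.Graph()
with inner.as_default(), sess.as_default():
    c = tf.constant(1)       # `c` lives in `inner`
    try:
        c.eval()             # default session's graph is `outer`, not `inner`
        err = None
    except ValueError as e:
        err = e

print(err)                   # "Cannot use the default session to evaluate tensor..."
```

This is why creating the graph before the session, as suggested in the reply below, resolves this first error.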

shancarter avatar Mar 15 '19 21:03 shancarter

Hey Shan! It looks like you are loading the model, but it's being loaded into the wrong graph: the session is created before the new graph, so the two don't match.

Could you try doing something like this:

# Create graph before session :)
with tf.Graph().as_default() as graph, tf.Session() as sess:
    tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], model_path)
    model = CustomModel()
    model.graph_def = tf.get_default_graph().as_graph_def()
    T = render.make_vis_T(model, "Mixed5b:1")
    tf.initialize_all_variables().run()
    for i in range(10):
      T("vis_op").run()
      showarray(T("input").eval()[0])

colah avatar Mar 17 '19 20:03 colah

Thanks! Sadly when I try that sequence it throws a different error: FailedPreconditionError: Attempting to use uninitialized value at T("vis_op").run()

shancarter avatar Mar 18 '19 01:03 shancarter

Weird. Could you provide the full stack trace?


colah avatar Mar 18 '19 01:03 colah

I think you still have two graphs here: one is the default graph, and the other is created by CustomModel(). tf.initialize_all_variables().run() only initializes the variables of the default graph. You can try the below!

# Create graph before session :)
with tf.Graph().as_default() as graph, tf.Session() as sess:
    tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], model_path)
    model = CustomModel()
    model.graph_def = tf.get_default_graph().as_graph_def()
    T = render.make_vis_T(model, "Mixed5b:1")
    with T("vis_op").graph.as_default():
        # initialize your model's variables
        tf.initialize_all_variables().run()
    # this may not be required, since lucid executes only the graph it creates, not the default graph
    tf.initialize_all_variables().run()
    for i in range(10):
      T("vis_op").run()
      showarray(T("input").eval()[0])

hegman12 avatar Mar 18 '19 15:03 hegman12
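One caveat with the snippets above: tf.initialize_all_variables() (long deprecated in favor of tf.global_variables_initializer()) reinitializes every variable, which would also clobber the weights that tf.saved_model.loader.load just restored. A safer pattern is to initialize only the variables the session has not seen yet. A hedged sketch of that idea (the helper initialize_uninitialized is illustrative, not part of lucid or TensorFlow):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

def initialize_uninitialized(sess):
    """Initialize only variables the session has not initialized yet,
    leaving already-restored weights untouched."""
    all_vars = tf.global_variables()
    flags = sess.run([tf.is_variable_initialized(v) for v in all_vars])
    new_vars = [v for v, done in zip(all_vars, flags) if not done]
    if new_vars:
        sess.run(tf.variables_initializer(new_vars))

with tf.Graph().as_default(), tf.Session() as sess:
    restored = tf.Variable(3.0)        # stands in for a weight loaded from the SavedModel
    sess.run(restored.initializer)     # simulate the loader restoring it
    fresh = tf.Variable(4.0)           # e.g. a global_step or param_f variable lucid adds
    initialize_uninitialized(sess)     # initializes `fresh` only
    print(sess.run([restored, fresh]))
```

Applied to the thread's code, calling such a helper after make_vis_T (inside the right graph) would initialize lucid's new variables without touching the restored model weights.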